Accelerate PyTorch workloads with Cloud TPUs and OpenXLA

Published: May 16, 2024, 14:08
Google Cloud AI accelerators enable high-performance, cost-effective training and inference for leading AI/ML frameworks. In this session, get the latest news on PyTorch/XLA, developed collaboratively by Google, Meta, and partners across the AI ecosystem, which uses OpenXLA to accelerate PyTorch workloads.
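
As a rough illustration of what running PyTorch on a Cloud TPU via PyTorch/XLA looks like, here is a minimal single-device training-step sketch. It assumes the torch_xla package is installed on a TPU VM; the model, data, and hyperparameters are placeholders, not anything specific to the session.

```python
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm

# Acquire the XLA device (a TPU core when running on a Cloud TPU VM).
device = xm.xla_device()

# Placeholder model and synthetic data; swap in your own module and loader.
model = nn.Linear(128, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(64, 128, device=device)
targets = torch.randint(0, 10, (64,), device=device)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    # mark_step() cuts the lazily traced graph so XLA can compile and
    # execute the accumulated operations on the TPU.
    xm.mark_step()
```

PyTorch/XLA traces operations lazily and hands the resulting graph to the OpenXLA compiler, so explicitly marking step boundaries (here via xm.mark_step) is what triggers compilation and execution on the accelerator.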

GitHub → goo.gle/3xQeFaC
Cloud Tensor Processing Units (TPUs) → goo.gle/tpu

Speaker: Shauheen Zahirazami

Watch more:
Check out all the AI videos at Google I/O 2024 → goo.gle/io24-ai-yt
Check out all the Cloud videos at Google I/O 2024 → goo.gle/io24-cloud-yt
Check out all the Mobile videos at Google I/O 2024 → goo.gle/io24-mobile-yt
Check out all the Web videos at Google I/O 2024 → goo.gle/io24-web-yt

Subscribe to Google Developers → goo.gle/developers

#GoogleIO

Products Mentioned: Cloud - AI and Machine Learning - Cloud TPU
Event: Google I/O 2024