Easy and affordable access to GPUs for AI/ML workloads

Published April 25, 2024, 16:00
The growth in AI/ML training, fine-tuning, and inference workloads has created exponential demand for GPU capacity, making accelerators a scarce resource. Join Debi Cabrera as she chats with Google Product Managers Laura Ionita and Ari Liberman about how Dynamic Workload Scheduler (DWS) works, Compute Engine consumption models, and more. Watch along and learn how to get started today!
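
For a rough sense of how the Compute Engine consumption model discussed in the episode is used in practice, here is a minimal sketch of requesting GPU capacity through Dynamic Workload Scheduler Flex Start by creating a resize request on a managed instance group. The project, zone, MIG name, VM count, and run duration are placeholders, and the generated client and field names are assumed to mirror the Compute Engine instanceGroupManagerResizeRequests resource in a recent google-cloud-compute release; verify them against the official documentation before use.

# Sketch only: client and field names below are assumed to match the Compute
# Engine REST resource instanceGroupManagerResizeRequests; check your installed
# google-cloud-compute version before relying on them.
from google.cloud import compute_v1

PROJECT = "my-project"      # placeholder project ID
ZONE = "us-central1-a"      # placeholder zone
MIG = "gpu-training-mig"    # placeholder MIG whose instance template requests GPU VMs

client = compute_v1.InstanceGroupManagerResizeRequestsClient()

# Ask DWS Flex Start for 4 GPU VMs, to run for up to 4 hours once capacity is found.
resize_request = compute_v1.InstanceGroupManagerResizeRequest(
    name="dws-gpu-request-1",
    resize_by=4,
    requested_run_duration=compute_v1.Duration(seconds=4 * 3600),
)

operation = client.insert(
    project=PROJECT,
    zone=ZONE,
    instance_group_manager=MIG,
    instance_group_manager_resize_request_resource=resize_request,
)
operation.result()  # waits for the request to be accepted; VMs start when capacity is available

The key difference from a plain MIG resize is the requested run duration: DWS provisions all requested VMs together once capacity is found and reclaims them when the duration ends, which is what makes it a fit for batch training and fine-tuning jobs.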

Chapters:
0:00 - Meet Laura and Ari
1:04 - What is Dynamic Workload Scheduler?
3:21 - Which workloads function with Dynamic Workload Scheduler?
4:59 - How to choose between Compute Engine models
6:42 - Combining different Compute Engine models
8:32 - Real world examples
10:37 - Get started with Dynamic Workload Scheduler
11:20 - Wrap up

Resources:
Watch the full session here → goo.gle/49K98Qi
Introducing Dynamic Workload Scheduler → goo.gle/3Jn3oB0

Watch more Cloud Next 2024 → goo.gle/Next-24
Subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech

#GoogleCloudNext #GoogleGemini

Event: Google Cloud Next 2024
Speakers: Debi Cabrera, Laura Ionita, Ari Liberman
Products Mentioned: Google Compute Engine, Dynamic Workload Scheduler