Easy and affordable access to GPUs for AI/ML workloads

Published April 25, 2024, 16:00
The growth in AI/ML training, fine-tuning, and inference workloads has created exponential demand for GPU capacity, making accelerators a scarce resource. Join Debi Cabrera as she chats with Google Product Managers Laura Ionita and Ari Liberman about how Dynamic Workload Scheduler (DWS) works, Compute Engine consumption models, and more. Watch along and learn how to get started today!

Chapters:
0:00 - Meet Laura and Ari
1:04 - What is Dynamic Workload Scheduler?
3:21 - Which workloads function with Dynamic Workload Scheduler?
4:59 - How to choose between Compute Engine models
6:42 - Combining different Compute Engine models
8:32 - Real world examples
10:37 - Get started with Dynamic Workload Scheduler
11:20 - Wrap up

Resources:
Watch the full session here → goo.gle/49K98Qi
Introducing Dynamic Workload Scheduler → goo.gle/3Jn3oB0

Watch more Cloud Next 2024 → goo.gle/Next-24
Subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech

#GoogleCloudNext #GoogleGemini

Event: Google Cloud Next 2024
Speakers: Debi Cabrera, Laura Ionita, Ari Liberman
Products Mentioned: Google Compute Engine, Dynamic Workload Scheduler