Pre-training foundation models on Amazon SageMaker | Step 3: Distributed training

Published July 24, 2024, 15:34
Amazon SageMaker helps you reduce the time and cost of training foundation models (FMs) at scale without managing infrastructure. This video series will provide step-by-step guidance on training FMs from scratch on SageMaker.
SageMaker distributed training libraries can automatically split large models and training datasets across AWS GPU instances. In this video, you will learn how to run high-performance distributed training on SageMaker using optimized PyTorch Fully Sharded Data Parallel (FSDP) and the SageMaker distributed training libraries.
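As a rough illustration of the setup the video covers, the sketch below builds the keyword arguments you might pass to a `sagemaker.pytorch.PyTorch` estimator to launch an FSDP training job with `torchrun`. The entry-point filename, framework versions, and instance counts are placeholder assumptions, not values from the video; the `sagemaker` import and AWS-specific arguments (role, S3 paths) are left out so the config itself is self-contained.

```python
# Hypothetical launcher sketch -- script name, versions, and instance settings
# below are illustrative placeholders, not taken from the video.

def build_fsdp_job_config(instance_type="ml.p4d.24xlarge", instance_count=2):
    """Return keyword arguments for a sagemaker.pytorch.PyTorch estimator."""
    return {
        "entry_point": "train_fsdp.py",    # your FSDP training script (placeholder name)
        "instance_type": instance_type,    # GPU instance type to train on
        "instance_count": instance_count,  # number of nodes to shard the model across
        "framework_version": "2.2",        # PyTorch version (assumed)
        "py_version": "py310",
        # torch_distributed asks SageMaker to launch the script with torchrun,
        # which sets up the process group that FSDP relies on.
        "distribution": {"torch_distributed": {"enabled": True}},
    }

config = build_fsdp_job_config()
print(config["distribution"])
```

In a real notebook you would unpack this dict into the estimator (`PyTorch(**config, role=..., ...)`) and call `.fit()`; inside `train_fsdp.py`, the model is wrapped with `torch.distributed.fsdp.FullyShardedDataParallel` so its parameters are sharded across the GPUs that `torchrun` spawns.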

Learn more at: go.aws/3Vgd31M

Subscribe:
More AWS videos: go.aws/3m5yEMW
More AWS events videos: go.aws/3ZHq4BK

Do you have technical AWS questions?
Ask the community of experts on AWS re:Post: go.aws/3lPaoPb

ABOUT AWS
Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers — including the fastest-growing startups, largest enterprises, and leading government agencies — are using AWS to lower costs, become more agile, and innovate faster.

#AWS #AmazonWebServices #CloudComputing