Petabyte-scale data migrations the Google Cloud way

Published December 19, 2023, 19:18
Learn how to easily migrate a petabyte-scale data pipeline from a legacy platform to Google Cloud. We will present a real-world data pipeline case covering technology best practices such as processing time, pipeline optimization, and performance.
Beyond the technology aspects, attention to people, change management, team structure, and data governance was also vital to amplify the business impact of such a significant cost reduction.

You will hear from Globo, the largest media group in Latin America, which supports millions of concurrent users for events like Big Brother Brasil and the World Cup. Globo is on track to migrate petabytes of data to Google Cloud and is using Dataform effectively to do things bigger, better, faster, and cheaper with the help of Google Cloud Partner Thoughtworks. Data migration was the first crucial step in creating great value for Globo, opening up new possibilities.
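The session itself does not share code, but as a rough illustration of one step in this kind of migration, here is a minimal sketch that batch-loads Parquet exports from Cloud Storage into BigQuery using the official Python client. The project, bucket, dataset, and table names are hypothetical placeholders, not Globo's actual setup.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical identifiers -- replace with your own project, bucket, and dataset.
PROJECT_ID = "my-project"
SOURCE_URI = "gs://my-legacy-export-bucket/events/*.parquet"
DEST_TABLE = "my-project.analytics.events"

client = bigquery.Client(project=PROJECT_ID)

# Append Parquet files exported from the legacy platform into a BigQuery table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(SOURCE_URI, DEST_TABLE, job_config=job_config)
load_job.result()  # Block until the load job finishes (raises on failure).

table = client.get_table(DEST_TABLE)
print(f"Loaded {table.num_rows} rows into {DEST_TABLE}")
```

At petabyte scale this kind of load would typically be orchestrated in batches and followed by Dataform-managed SQL transformations inside BigQuery; the snippet above only shows the basic load primitive.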

Speakers: Manjul Sahay, Guilherme Silveira, Guilherme Balestieri Bedin

Watch more:
All sessions from Google Cloud Next → goo.gle/next23

#GoogleCloudNext

ARC214