Moving your Spark and Hadoop workloads to Google Cloud Platform (Google Cloud Next '17)

Published March 9, 2017, 22:33
For those who use popular open source data processing tools like Apache Spark and Hadoop, several steps stand between you and actually focusing on the data itself: creating a cluster to run the open source tools, finding a software package to install and manage them, and then finding people to write jobs and applications and to operate, maintain, and scale your clusters. In this video, you'll learn how to use Google Cloud's managed Spark and Hadoop service, Google Cloud Dataproc, to take advantage of your existing investments in Spark and Hadoop. You'll see how easily your existing data and code can be migrated to Google Cloud Platform (GCP), and how, with Cloud Dataproc, clusters can be right-sized, run ephemerally, and cleanly separated per workload to make the most of your resources.
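The ephemeral, per-job cluster pattern the talk describes can be sketched with the `gcloud dataproc` CLI. This is a minimal illustration, not from the talk itself; the cluster name, region, bucket, and jar path are placeholders you would replace with your own values.

```shell
#!/usr/bin/env bash
# Sketch of an ephemeral Cloud Dataproc workflow: create a right-sized
# cluster, submit one Spark job, then delete the cluster so you only
# pay while the job runs. All names below are hypothetical examples.

CLUSTER=my-ephemeral-cluster   # placeholder cluster name
REGION=us-central1             # placeholder region

# Create a small cluster sized for this one job.
gcloud dataproc clusters create "$CLUSTER" \
    --region="$REGION" \
    --num-workers=2

# Submit an existing Spark job; code and data can live in
# Cloud Storage (gs:// paths are placeholders).
gcloud dataproc jobs submit spark \
    --cluster="$CLUSTER" \
    --region="$REGION" \
    --class=com.example.MyJob \
    --jars=gs://my-bucket/my-job.jar

# Tear the cluster down as soon as the job finishes.
gcloud dataproc clusters delete "$CLUSTER" \
    --region="$REGION" --quiet
```

Because each job gets its own cluster, workloads stay cleanly separated and each cluster can be sized to its job rather than to peak aggregate demand.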

Missed the conference? Watch all the talks here: goo.gl/c1Vs3h
Watch more talks about Big Data & Machine Learning here: goo.gl/OcqI9k