Published May 14, 2024, 17:26
Introducing Project Astra. We created a demo in which a tester interacts with a prototype AI agent built on our multimodal foundation model, Gemini.
There are two continuous takes: one with the prototype running on a Google Pixel phone and another on a prototype glasses device.
The agent takes in a constant stream of audio and video input. It can reason about its environment in real time and interact with the tester in a conversation about what it is seeing.
Learn more about Project Astra: goo.gle/3wAUwFh
#GoogleIO2024
Watch the full Google I/O 2024 keynote: youtube.com/live/XEzRZ35urlk?s...
To watch this keynote with American Sign Language (ASL) interpretation, please click here: youtube.com/live/6rP2rEWsfpM?s...
#GoogleIO
Subscribe to our Channel: youtube.com/google
Find us on X: twitter.com/google
Watch us on TikTok: tiktok.com/@google
Follow us on Instagram: instagram.com/google
Join us on Facebook: facebook.com/Google