Unfamiliar Territory? LLM-based Road Rules Guide Simplifies Driving - DRIVE Labs Ep. 34

Published May 23, 2024, 17:04
Adapting driving behavior to new environments, customs, and laws is a long-standing challenge in autonomous driving. LLaDA (Large Language Driving Assistant) is an #LLM network that makes it easier to navigate unfamiliar places by providing real-time guidance on regional traffic rules in different languages, for both human drivers and #autonomousvehicles. LLaDA will be powered by NVIDIA DRIVE Thor, which harnesses the new #generativeAI capabilities of NVIDIA’s Blackwell GPU architecture.

00:00:12 - Introducing LLaDA
00:00:36 - Multimodal neural interfaces integrate diverse types of data
00:00:53 - LLaDA can rapidly adapt to local traffic rules and customs
00:01:30 - The NVIDIA Riva speech SDK can process different languages
00:01:58 - LLaDA can also be applied to AV motion planning
00:02:28 - Accelerated by NVIDIA DRIVE Thor, built on the NVIDIA Blackwell architecture
00:03:02 - Visit our GitHub page and check out the LLaDA paper at CVPR 2024

Project page: boyiliee.github.io/llada
Paper: arxiv.org/abs/2402.05932
Watch the full series here: nvda.ws/3LsSgnH
Learn more about DRIVE Labs: nvda.ws/36r5c6t
Follow us on social:
Twitter: nvda.ws/3LRdkSs
LinkedIn: nvda.ws/3wI4kue
#NVIDIADRIVE