Unfamiliar Territory? LLM-based Road Rules Guide Simplifies Driving - DRIVE Labs Ep. 34

Published May 23, 2024, 17:04
Adapting driving behavior to new environments, customs, and laws is a long-standing challenge in autonomous driving. LLaDA (Large Language Driving Assistant) is an #LLM network that makes it easier to navigate in unfamiliar places by providing real-time guidance on regional traffic rules in different languages, for both human drivers and #autonomousvehicles. LLaDA will be powered by NVIDIA DRIVE Thor, which harnesses the new #generativeAI capabilities of NVIDIA’s Blackwell GPU architecture.
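The core idea described above, conditioning an LLM on local traffic rules so its guidance adapts to the region, can be sketched as a simple prompt-construction step. This is a hypothetical illustration only; the function name, rule text, and prompt wording are not from the LLaDA codebase:

```python
def build_guidance_prompt(plan: str, country: str, local_rules: list[str]) -> str:
    """Compose an LLM prompt asking for a driving plan adapted to local rules.

    A hypothetical sketch of the prompt-conditioning idea behind an
    LLM-based road-rules assistant; not the actual LLaDA implementation.
    """
    # Format the regional rules as a bulleted context block for the LLM.
    rules = "\n".join(f"- {r}" for r in local_rules)
    return (
        f"You are a driving assistant operating in {country}.\n"
        f"Local traffic rules:\n{rules}\n"
        f"Current plan: {plan}\n"
        "Rewrite the plan so it complies with the local rules."
    )

# Example: adapting a U.S.-style plan for driving in the U.K.
prompt = build_guidance_prompt(
    plan="Turn right at the signal, then merge into the right lane.",
    country="United Kingdom",
    local_rules=[
        "Drive on the left side of the road.",
        "Do not turn on a red light.",
    ],
)
print(prompt)
```

In a full system, the returned prompt would be sent to an LLM, with the rules block retrieved per region rather than hard-coded.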

00:00:12 - Introducing LLaDA
00:00:36 - Multimodal neural interfaces integrate diverse types of data
00:00:53 - LLaDA can rapidly adapt to local traffic rules and customs
00:01:30 - The NVIDIA Riva speech SDK can process different languages
00:01:58 - LLaDA can also be applied to AV motion planning
00:02:28 - Accelerated by NVIDIA DRIVE Thor, built on the NVIDIA Blackwell architecture
00:03:02 - Visit our GitHub page and check out the LLaDA paper at CVPR 2024

Project page: boyiliee.github.io/llada
Paper: arxiv.org/abs/2402.05932
Watch the full series here: nvda.ws/3LsSgnH
Learn more about DRIVE Labs: nvda.ws/36r5c6t
Follow us on social:
Twitter: nvda.ws/3LRdkSs
LinkedIn: nvda.ws/3wI4kue
#NVIDIADRIVE