Video Abstract: Human-Robot Collaboration

Published June 28, 2017, 16:14
This demonstration uses SoftBank’s Pepper robot as testbed hardware to show a set of human-robot collaboration activities built on Microsoft Cognitive Services and other Microsoft Research technologies.

As both a research and prototype-engineering effort, the project implements software inspired by concepts such as Brooks’ subsumption architecture, distributing the robot’s “brain” activities across three tiers: the local device for reflex functions, the local facility infrastructure for recognition functions, and remote API services hosted in the cloud for cognitive functions (a minimal routing sketch appears below). The implementation is designed to be machine-independent and applicable to any robot that requires human-collaboration capabilities. This approach has also enabled new investigations into non-verbal communication, with body movements expressed and documented using Labanotation, so that the robot can process conversations with humans and automatically generate life-like, meaningful physical behaviors to accompany its spoken words.
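
As a rough illustration of the three-tier distribution described above, the Python sketch below routes incoming robot events to a reflex, recognition, or cognition layer. It is not the project’s actual code: the event kinds, class names, and placeholder actions are assumptions made purely for illustration.

```python
# Minimal sketch of the three-tier distribution described above.
# All names and event kinds are hypothetical; they do not come from the
# project's codebase or from any Microsoft SDK.

from dataclasses import dataclass
from enum import Enum, auto


class Tier(Enum):
    REFLEX = auto()       # runs on the robot itself (lowest latency)
    RECOGNITION = auto()  # runs on local facility infrastructure
    COGNITION = auto()    # runs against remote cloud APIs


@dataclass
class SensorEvent:
    kind: str       # e.g. "bumper", "face_image", "utterance"
    payload: object


def route(event: SensorEvent) -> Tier:
    """Decide which tier should handle an incoming sensor event."""
    if event.kind in ("bumper", "fall_detect"):
        return Tier.REFLEX          # safety reflexes stay on-device
    if event.kind in ("face_image", "audio_frame"):
        return Tier.RECOGNITION     # perception runs on the local network
    return Tier.COGNITION           # dialogue/understanding goes to the cloud


def handle(event: SensorEvent) -> str:
    """Dispatch the event and return a placeholder result for each tier."""
    tier = route(event)
    if tier is Tier.REFLEX:
        return "stop_motors"                    # placeholder reflex action
    if tier is Tier.RECOGNITION:
        return f"recognized({event.kind})"      # placeholder local result
    return f"cloud_response({event.payload})"   # placeholder cloud call


if __name__ == "__main__":
    for e in (SensorEvent("bumper", None),
              SensorEvent("face_image", b"\x00"),
              SensorEvent("utterance", "hello Pepper")):
        print(e.kind, "->", handle(e))
```

The design point of the layering is latency and robustness: reflexes never depend on the network, while heavier recognition and cognition can be offloaded without changing the on-robot code.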
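The pairing of spoken words with Labanotation-scored gestures can likewise be pictured as a lookup from dialogue concepts to symbolic movement descriptions. The sketch below is purely illustrative, assuming a toy keyword-based concept extractor and a hand-written table of Labanotation-like entries; the real system’s representation and pipeline are not detailed in this abstract.

```python
# Toy illustration of concept-to-gesture mapping with Labanotation-style
# entries. The keywords and gesture scores below are invented for
# illustration only; they are not the project's actual data.

from typing import List, Tuple

# Each entry is a (body part, direction, level, beats) tuple, loosely in the
# spirit of Labanotation's columns for body part, direction, and timing.
GESTURE_LIBRARY = {
    "greeting":  [("right_arm", "forward", "high", 2)],
    "agreement": [("head", "nod", "middle", 1)],
    "pointing":  [("right_arm", "side", "middle", 2),
                  ("torso", "forward", "middle", 1)],
}


def extract_concepts(utterance: str) -> List[str]:
    """Very rough keyword spotting standing in for real dialogue analysis."""
    words = utterance.lower()
    concepts = []
    if any(w in words for w in ("hello", "hi", "welcome")):
        concepts.append("greeting")
    if any(w in words for w in ("yes", "sure", "agree")):
        concepts.append("agreement")
    if "over there" in words or "look at" in words:
        concepts.append("pointing")
    return concepts


def gestures_for(utterance: str) -> List[Tuple[str, str, str, int]]:
    """Return the Labanotation-style entries to perform alongside speech."""
    return [entry
            for concept in extract_concepts(utterance)
            for entry in GESTURE_LIBRARY.get(concept, [])]


if __name__ == "__main__":
    print(gestures_for("Hello! Yes, look at the screen over there."))
```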

See more on this video at microsoft.com/en-us/research/e...