Microsoft Research
Published October 28, 2020
We introduce a way to enable more natural interaction between humans and robots through Mixed Reality, by using a shared coordinate system. Azure Spatial Anchors, which already supports colocalizing multiple HoloLens and smartphone devices in the same space, has now been extended to support robots equipped with cameras.
This allows humans and robots sharing the same space to interact naturally: humans can see the robot's plan and intent, while the robot can interpret commands given from the person's perspective. We hope this can be a building block toward a future in which humans and robots are collaborators and coworkers.
See more at microsoft.com/en-us/research/v...
Check out the code at aka.ms/ASALinuxSDK
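For illustration only, here is a minimal Python sketch of the shared-coordinate-system idea described above. It is not the Azure Spatial Anchors SDK API; the function, variable names, and pose values are placeholders. The point is simply that once the HoloLens and the robot have each localized the same anchor, a goal given in the person's frame can be re-expressed in the robot's frame by composing rigid transforms.

import numpy as np

def make_transform(rotation, translation):
    # Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector.
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Pose of the shared anchor as seen by each device (placeholder values;
# in practice these come from each device localizing the anchor).
# T_robot_anchor maps points from the anchor frame into the robot frame;
# T_human_anchor maps points from the anchor frame into the person's frame.
T_robot_anchor = make_transform(np.eye(3), np.array([2.0, 0.0, 0.5]))
T_human_anchor = make_transform(np.eye(3), np.array([-1.0, 0.0, 1.6]))

# A goal the person specifies in their own (HoloLens) frame,
# e.g. "go here", in homogeneous coordinates.
goal_in_human_frame = np.array([0.5, 0.0, 1.0, 1.0])

# Re-express the goal: human frame -> anchor frame -> robot frame.
T_anchor_human = np.linalg.inv(T_human_anchor)
goal_in_robot_frame = T_robot_anchor @ T_anchor_human @ goal_in_human_frame

print("Goal in robot frame:", goal_in_robot_frame[:3])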