Machine Learning for Embodied Design in Virtual Reality

Published April 19, 2017, 23:16
Much of my research has tried to create virtual characters that are able to interact with real people via body language in immersive virtual reality. While some very simple models can create impressive effects and a strong sense of being with another person, these models fail to capture the complexity of human behaviour, and it can be hard to design the mannerisms of a particular character. The key problem is that body language is an unconscious, tacit behaviour: we all do it, but we don't really know what we do or how we do it. Machine learning makes it possible to design behaviour by giving examples (by doing) rather than by creating rules, which requires knowing what we do. In particular, using machine learning interactively can enable what I call "embodied design", where we design through our embodied behaviour. This talk will discuss some of the challenges, particularly with debugging, and end by trying to understand the role of interactive machine learning through an analogy with how humans learn and teach tacit, embodied skills.
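
To make the idea of designing by example more concrete, here is a minimal, purely illustrative sketch of such an interactive loop: a designer demonstrates example poses labelled with the character response they should evoke, a model is trained on those demonstrations instead of hand-written rules, and live tracking data is then classified to drive the character. The pose features, labels, and the nearest-neighbour classifier are my own assumptions for illustration, not the system described in this talk.

```python
# Illustrative sketch of "design by example" (not the speaker's system):
# train a simple classifier on demonstrated pose examples, then use it to
# choose a character response from live tracking data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each example: a flattened pose feature vector (e.g. joint angles),
# paired with the character behaviour it should trigger.
example_poses = np.array([
    [0.1, 0.9, 0.2, 0.0],   # open, forward-leaning posture
    [0.2, 0.8, 0.1, 0.1],
    [0.9, 0.1, 0.8, 0.7],   # closed, turned-away posture
    [0.8, 0.2, 0.9, 0.6],
])
responses = ["approach", "approach", "withdraw", "withdraw"]

# Learn from the demonstrations rather than from hand-authored rules.
model = KNeighborsClassifier(n_neighbors=1)
model.fit(example_poses, responses)

# At runtime, classify the tracked person's pose and drive the character.
live_pose = np.array([[0.15, 0.85, 0.15, 0.05]])
print(model.predict(live_pose))   # -> ['approach']
```

In an interactive workflow the designer would add or relabel examples and retrain repeatedly, shaping the character's mannerisms by doing rather than by specifying rules.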

See more on this video at microsoft.com/en-us/research/v...