Multimodal Gaze-Supported Interaction

Published 8 August 2016, 23:09
While our eye gaze represents an important medium for perceiving our environment, it also serves as a fast and implicit way of signaling interest in somebody or something. This could also enable flexible and convenient interaction with diverse computing systems, ranging from small handheld devices to multiple large-sized screens. Considerable research has already been conducted on gaze-only interaction, which, however, is often described as error-prone, imprecise, and unnatural. To overcome these challenges, multimodal combinations of gaze with additional input modalities show high potential for fast, fluent, and convenient human-computer interaction in diverse user contexts. A promising example of this novel style of multimodal gaze-supported interaction is the seamless selection and manipulation of graphical objects on distant screens using a combination of a mobile handheld device (such as a smartphone) and gaze input. In my talk, I will provide a brief introduction to gaze-based interaction in general and present insights into my research at the Interactive Media Lab. In doing so, I will particularly emphasize the high potential of the emerging area of multimodal gaze-supported interaction.
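To make the "gaze suggests, touch confirms" idea behind such a combination more concrete, here is a minimal Python sketch: gaze continuously indicates a candidate object on the distant screen, and a tap on the handheld triggers the actual selection. All names here (Target, select_on_tap, the gaze coordinate feed) are hypothetical illustrations, not the lab's actual system or API.

```python
# Sketch of gaze-supported selection: gaze points, a smartphone tap confirms.
# Assumes a hypothetical stream of gaze coordinates in screen space.
from dataclasses import dataclass


@dataclass
class Target:
    name: str
    x: float
    y: float
    radius: float

    def contains(self, gx: float, gy: float) -> bool:
        # Targets are treated as circles; a real system would use widget
        # bounds plus smoothing/filtering of the noisy gaze signal.
        return (gx - self.x) ** 2 + (gy - self.y) ** 2 <= self.radius ** 2


def select_on_tap(targets: list[Target], gaze_xy: tuple[float, float]) -> Target | None:
    """Called when the handheld reports a tap: return the target under gaze, if any."""
    gx, gy = gaze_xy
    for t in targets:
        if t.contains(gx, gy):
            return t
    return None


# Usage example with made-up screen coordinates:
targets = [Target("photo", 200, 150, 60), Target("video", 800, 500, 60)]
print(select_on_tap(targets, (210, 140)))   # tap while looking near "photo" -> selected
print(select_on_tap(targets, (400, 400)))   # tap while looking at empty space -> None
```

The point of the split is that the imprecise but fast gaze signal only proposes a target, while the deliberate touch event commits to it, which is one common way such multimodal combinations sidestep the Midas-touch problem of gaze-only selection.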