Multimodal Gaze-Supported Interaction

Video Link: https://www.youtube.com/watch?v=O8gTJOIV5gA



Duration: 54:41


While our eye gaze is an important medium for perceiving our environment, it also serves as a fast and implicit way of signaling interest in somebody or something. This makes it a promising channel for flexible and convenient interaction with diverse computing systems, ranging from small handheld devices to multiple large screens. Considerable research has already been devoted to gaze-only interaction, which, however, is often described as error-prone, imprecise, and unnatural. To overcome these challenges, multimodal combinations of gaze with additional input modalities show high potential for fast, fluent, and convenient human-computer interaction in diverse user contexts. A promising example of this novel style of multimodal gaze-supported interaction is the seamless selection and manipulation of graphical objects on distant screens using a combination of gaze and a mobile handheld device such as a smartphone. In my talk, I will give a brief introduction to gaze-based interaction in general and present insights into my research at the Interactive Media Lab, with particular emphasis on the high potential of the emerging area of multimodal gaze-supported interaction.







Tags:
microsoft research