OmniTouch: Wearable Multitouch Interaction Everywhere

Subscribers: 344,000
Published on ● Video Link: https://www.youtube.com/watch?v=V7XKp8cq9Sc
Duration: 3:27
2,140 views
27


OmniTouch is a wearable depth-sensing and projection system that enables interactive multitouch applications on everyday surfaces. Beyond the shoulder-worn system, there is no instrumentation of the user or environment. Foremost, the system allows the wearer to use their hands, arms and legs as graphical, interactive surfaces. Users can also transiently appropriate surfaces from the environment to expand the interactive area (e.g., books, walls, tables). On such surfaces - without any calibration - OmniTouch provides capabilities similar to those of a mouse or touchscreen: X and Y location in 2D interfaces and whether fingers are "clicked" or hovering, enabling a wide variety of interactions. Thus, it is now conceivable that anything one can do on today's mobile devices could be done in the palm of the hand.
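
The core interaction described above hinges on deciding, from depth data alone, whether a tracked fingertip is touching the projected surface ("clicked") or merely hovering over it. The sketch below is not OmniTouch's actual pipeline; it is a minimal illustration, with assumed helper names and an assumed ~2 cm threshold, of how fingertip depth can be compared against the depth of the underlying surface to make that distinction.

```python
# Hypothetical sketch: classify a fingertip as "clicked" or hovering from a
# depth map. Thresholds and helper names are illustrative assumptions, not
# the OmniTouch implementation.
import numpy as np

CLICK_MM = 20.0  # fingertip within ~2 cm of the surface counts as a touch


def surface_depth_around(depth_mm: np.ndarray, x: int, y: int, radius: int = 12) -> float:
    """Estimate the background surface depth near (x, y) from a local patch,
    ignoring the (closer) finger pixels by taking an upper percentile."""
    h, w = depth_mm.shape
    patch = depth_mm[max(0, y - radius):min(h, y + radius),
                     max(0, x - radius):min(w, x + radius)]
    valid = patch[patch > 0]  # drop invalid/zero depth readings
    return float(np.percentile(valid, 90)) if valid.size else float("nan")


def classify_fingertip(depth_mm: np.ndarray, x: int, y: int) -> str:
    """Return 'clicked' if the fingertip sits close to the surface, else 'hover'."""
    finger_d = float(depth_mm[y, x])
    surface_d = surface_depth_around(depth_mm, x, y)
    if not np.isfinite(surface_d) or finger_d <= 0:
        return "unknown"
    return "clicked" if (surface_d - finger_d) < CLICK_MM else "hover"


# Example: a synthetic 240x320 depth frame with a flat surface at 600 mm and a
# fingertip patch at (160, 120) sitting 50 mm above it.
frame = np.full((240, 320), 600.0)
frame[118:123, 158:163] = 550.0
print(classify_fingertip(frame, 160, 120))  # -> "hover"
```

Because the surface depth is estimated locally around the fingertip, an approach like this works on curved or ad-hoc surfaces (a palm, a book, a wall) without a global calibration step, which is the property the description emphasizes.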




Other Videos By Microsoft Research


2016-10-11  Bringing Physics to the Surface
2016-10-11  Separability of Spatial Manipulations in Multi-touch Interfaces
2016-10-11  Augmenting Interactive Tables with Mice & Keyboards
2016-10-11  Interactions in the Air: Adding Further Depth to Interactive Tabletops
2016-10-11  Simulating Grasping Behavior on an Imaging Interactive Surface
2016-10-11  Design and Evaluation of Interaction Models for Multi Touch Mice
2016-10-11  Pen + Touch = New Tools
2016-10-11  Combining Multiple Depth Cameras and Projectors for Interactions On, Above, and Between Surfaces
2016-10-11  Using a Depth Camera as a Touch Sensor
2016-10-11  Data Miming: Inferring Spatial Object Descriptions from Human Gesture
2016-10-11  OmniTouch: Wearable Multitouch Interaction Everywhere
2016-10-11  Phone as a Pixel: Enabling Ad-hoc, Large-Scale Displays Using Mobile Devices
2016-10-07  Latin American Faculty Summit 2016 - Intelligent Devices
2016-10-07  Pixel based Interaction Techniques
2016-10-07  InfraStructs: Fabricating Information Inside Physical Objects for Imaging in the Terahertz Region
2016-10-07  A Webbased Frontend for Easy Interaction with the Inductive Programming System Igor
2016-10-07  Toward Telelocomotion
2016-10-07  CrossMotion Fusing Device and Image Motion for User Identification, Tracking and Device Association
2016-10-07  FoveAR: Combining an Optically See-Through Near-Eye Display with Projector-Based Spatial AR
2016-10-05  Using Mobile Devices to Lower Communication Barriers and Provide Autonomy with Gaze-Based AAC
2016-10-05  Counterfactual Evaluation and Learning from Logged User Feedback