Combining Multiple Depth Cameras and Projectors for Interactions On, Above, and Between Surfaces

Subscribers: 344,000
Video Link: https://www.youtube.com/watch?v=JMcmPX8oy94
Duration: 4:57
Views: 4,558
Likes: 58
Instrumented with multiple depth cameras and projectors, LightSpace is a small room installation designed to explore a variety of interactions and computational strategies related to interactive displays and the space that they inhabit. LightSpace cameras and projectors are calibrated to 3D real-world coordinates, allowing for projection of graphics correctly onto any surface visible by both camera and projector. Selective projection of the depth camera data enables emulation of interactive displays on un-instrumented surfaces (such as a standard table or office desk) and facilitates mid-air interactions between and around these displays. For example, after performing multi-touch interactions on a virtual object on the tabletop, the user may transfer the object to another display by simultaneously touching the object and the destination display. Or the user may “pick up” the object by sweeping it into their hand, see it sitting in their hand as they walk over to an interactive wall display, and “drop” the object onto the wall by touching it with their other hand. We detail the interactions and algorithms unique to LightSpace, discuss some initial observations of use, and suggest future directions.
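The calibration described above amounts to a standard pinhole camera model: once a projector's intrinsics and its pose in world coordinates are known, any 3D world point can be mapped to the projector pixel that illuminates it. The sketch below illustrates that mapping; the matrix values (`K`, `R`, `t`) are hypothetical placeholders, not LightSpace's actual calibration.

```python
import numpy as np

def project_point(K, R, t, X_world):
    """Map a 3D world point to 2D projector pixel coordinates
    using a pinhole model: X_cam = R @ X_world + t, then
    perspective division and application of the intrinsics K."""
    X_cam = R @ X_world + t          # world -> projector coordinate frame
    x, y, z = X_cam
    u = K[0, 0] * x / z + K[0, 2]    # focal length scaling + principal point
    v = K[1, 1] * y / z + K[1, 2]
    return u, v

# Hypothetical example calibration (not from the paper):
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                        # projector axes aligned with the world
t = np.array([0.0, 0.0, 0.0])

# A point 2 m in front of the projector, slightly off-axis:
u, v = project_point(K, R, t, np.array([0.1, 0.05, 2.0]))
print(round(u, 1), round(v, 1))      # 690.0 385.0
```

With each camera and projector expressed in the same world frame, a depth pixel observed by any camera can be re-projected through any projector, which is what allows graphics to land correctly on arbitrary surfaces.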




Other Videos By Microsoft Research


2016-10-11 · Depth-Sensing Video Cameras for 3D Tangible Tabletop Interaction
2016-10-11 · SurfaceFusion: Unobtrusive Tracking of Everyday Objects in Tangible User Interfaces
2016-10-11 · ShapeTouch: Leveraging Contact Shape on Interactive Surfaces
2016-10-11 · Bringing Physics to the Surface
2016-10-11 · Separability of Spatial Manipulations in Multi-touch Interfaces
2016-10-11 · Augmenting Interactive Tables with Mice & Keyboards
2016-10-11 · Interactions in the Air: Adding Further Depth to Interactive Tabletops
2016-10-11 · Simulating Grasping Behavior on an Imaging Interactive Surface
2016-10-11 · Design and Evaluation of Interaction Models for Multi Touch Mice
2016-10-11 · Pen + Touch = New Tools
2016-10-11 · Combining Multiple Depth Cameras and Projectors for Interactions On, Above, and Between Surfaces
2016-10-11 · Using a Depth Camera as a Touch Sensor
2016-10-11 · Data Miming: Inferring Spatial Object Descriptions from Human Gesture
2016-10-11 · OmniTouch: Wearable Multitouch Interaction Everywhere
2016-10-11 · Phone as a Pixel: Enabling Ad-hoc, Large-Scale Displays Using Mobile Devices
2016-10-07 · Latin American Faculty Summit 2016 - Intelligent Devices
2016-10-07 · Pixel based Interaction Techniques
2016-10-07 · InfraStructs: Fabricating Information Inside Physical Objects for Imaging in the Terahertz Region
2016-10-07 · A Web-based Frontend for Easy Interaction with the Inductive Programming System Igor
2016-10-07 · Toward Telelocomotion
2016-10-07 · CrossMotion: Fusing Device and Image Motion for User Identification, Tracking and Device Association