UCL MotionInput 3 - Introducing In-Air Multitouch

Channel: DoctorDeano
Subscribers: 53
Video Link: https://www.youtube.com/watch?v=5ub6BxkWl-A
Duration: 0:28
Views: 16


Our group at UCL has released UCL MotionInput Version 3 for Windows PCs. Navigate your browsers and apps with two-point multitouch in the air, along with speech commands such as "click", "double click" and "right click".
See www.touchlesscomputing.org for more information.




Other Videos By DoctorDeano


2022-06-28  UCL MotionInput 3 - "Minority Report" style swipe to change pages (custom gesture recorder)
2022-06-28  UCL MotionInput 3 - In-Air Light Gun (custom gesture recorder) - Duck Hunt simulation
2022-06-28  UCL MotionInput 3 - Trackmania with In Air Joypad
2022-06-28  UCL MotionInput 3 - Halo Spartan Assault - played with hands based In-Air Joypad
2022-06-28  UCL MotionInput 3 - Facial Navigation - Using Eye Gaze with Monkey Island on PC
2022-06-28  UCL MotionInput 3 - Facial Navigation - Navigating Google Maps StreetView using NoseNav with Speech
2022-06-28  UCL MotionInput 3 - Facial Navigation - Browsing Netflix with NoseNav and Speech
2022-06-28  UCL MotionInput 3 - Facial Navigation - Eye Gaze on Google Maps with Speech
2022-06-28  Forza Horizon 5 demo test of virtual In-Air Joypad with UCL MotionInput v3
2022-05-30  UCL MotionInput 3 - Introducing In-Air Multitouch
2022-05-26  UCL MotionInput 3 - Introducing In-Air Multitouch
2022-05-24  MotionInput 3 with Overcooked 2 - Hands based Joypad mode
2022-05-24  MotionInput 3 - “Steep” Head based control with In Air Hit Triggers
2022-05-24  MotionInput 3 for Raspberry Pi - education use cases
2022-05-24  MotionInput 3 with Scratch programming on a Raspberry Pi
2022-05-24  MotionInput 3 - Walking on the Spot demonstrations (v2 and v3)
2022-05-24  MotionInput 3 with PUBG
2022-05-24  MotionInput 3 with Minecraft
2022-05-24  MotionInput V2 (2021) - Development of the Exercises with Gaming modes
2022-05-11  UCL Coding Curriculum Hackathon 2022
2022-05-06  UCL MotionInput 3 - Raspberry Pi edition: Hands and Facial Navigation, with local speech recognition