UCL MotionInput 3 - Facial Navigation Part 2 - Nose and Eye Gaze, with federated speech
Channel: DoctorDeano
Subscribers: 53
Published on: 2022-03-25
Video Link: https://www.youtube.com/watch?v=9S4WMostGDs
Part 2 of 2 on Facial Navigation, covering face-based gestures, eye gaze detection with OpenVINO, nose-based navigation, and federated speech via UCL Ask KITA v1.0, powered by the VOSK engine.
Software download at http://software.cs.ucl.ac.uk/MotionInput.html
www.facialnavigation.com
Other Videos By DoctorDeano
2022-05-24 | MotionInput 3 with Minecraft |
2022-05-24 | MotionInput V2 (2021) - Development of the Exercises with Gaming modes |
2022-05-11 | UCL Coding Curriculum Hackathon 2022 |
2022-05-06 | UCL MotionInput 3 - Raspberry Pi edition: Hands and Facial Navigation, with local speech recognition |
2022-04-22 | UCL MotionInput 3 with FISECARE - care homes demonstration |
2022-03-25 | UCL MotionInput 3 - Basics of Touchless Navigation |
2022-03-25 | UCL MotionInput 3 - use case for physiotherapy |
2022-03-25 | UCL MotionInput 3 - Exercises for Population Health - movement in any existing games |
2022-03-25 | UCL MotionInput 3 - use case for carehomes |
2022-03-25 | UCL MotionInput 3 - Any screen as a touchscreen, inking in the air with depth, keyboard in the air |
2022-03-25 | UCL MotionInput 3 - Facial Navigation Part 2 - Nose and Eye Gaze, with federated speech |
2022-03-25 | UCL MotionInput 3 - Middleware and Facial Navigation Part 1 |
2022-03-25 | UCL MotionInput 3 - System Architecture and Capabilities |