UCL MotionInput v3.2 Accessible Planetarium - Facial Navigation and Space Unfolded Projection
A children's hospital ward could greatly benefit its patients with a planetarium projected directly into the room.
This project uses the Space Unfolded algorithm, developed by Damian Ziaber, to re-render planetarium software output so that it appears flat from the user's first-person viewpoint, given the items and corners of the room.
Both Facial Navigation and speech commands are integrated from the wider UCL MotionInput v3.2 project. The result is a fully accessible planetarium for the home, hospital or care home, which can be controlled with either facial movements and gestures or speech commands.
This solution requires only a regular webcam, one or more projectors, and a Windows PC.
Developed by Eloise Vriesman, Rebecca Conforti, Julia Xu and Robin Stewart, with integration from Damian Ziaber.
Supervised by Prof Dean Mohamedally
UCL Computer Science
Part of the UCL MotionInput v3.2 software project
www.touchlesscomputing.org