Robust, Visual-Inertial State Estimation: from Frame-based to Event-based Cameras
I will present the main algorithms for achieving robust, 6-DOF state estimation for mobile robots using passive sensing. Since cameras alone are not robust to high-speed motion and high-dynamic-range scenes, I will describe how IMUs and event-based cameras can be fused with visual information to achieve higher accuracy and robustness. I will then dig into the topic of event-based cameras, which are revolutionary sensors with microsecond latency, a very high dynamic range, and a measurement update rate almost a million times faster than that of standard cameras. Finally, I will show concrete applications of these methods to the autonomous navigation of vision-controlled drones.
Watch the video at https://www.microsoft.com/en-us/research/video/robust-visual-inertial-state-estimation-from-frame-based-to-event-based-cameras/