Description
Autonomous quadrotors will soon play a major role in search-and-rescue and remote-inspection missions, where a fast response is crucial. Quadrotors have the potential to navigate quickly through unstructured environments, enter and exit buildings through narrow gaps, and fly through collapsed structures. However, their speed and maneuverability are still far from those of birds and human pilots. Autonomous, vision-based agile navigation through unknown indoor environments poses a number of challenges for robotics research in perception, state estimation, planning, and control. In this talk, I will show how machine learning methods, combined with new low-latency sensors such as event-based cameras, allow drones to achieve unprecedented speed and robustness while relying solely on passive cameras, inertial sensors, and onboard computing.