Doctor of Philosophy (PhD)
Long-term autonomy is the dream of many roboticists, and if a robotic system can be split into three main categories – perception, planning, and control – then the biggest challenges to achieving this dream undoubtedly lie in perception. Large-scale environments that change over time, whether through normal operations or simply changes in lighting, are typical of the situations a robot encounters. As such, a Simultaneous Localization and Mapping (SLAM) system robust enough to handle these conditions is desired.
The objective of this dissertation is to present components leading to a robust dense visual SLAM system. It starts by exploring 3D reconstruction algorithms, drawing the distinction between local and global methods and presenting an incremental, adaptive global method designed to create depth maps as a robot navigates through space.
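To make the local/global distinction concrete, the sketch below shows a minimal *local* stereo method: per-pixel winner-take-all over matching costs aggregated in a small window, with no global smoothness term. This is an illustrative toy, not the dissertation's method; the function name and parameters are hypothetical, and border handling and sub-pixel refinement are omitted.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_disparity(left, right, max_disp=8, win=5):
    """Local stereo matching sketch: for each pixel, pick the disparity that
    minimizes the sum-of-absolute-differences (SAD) cost aggregated over a
    win x win window. Global methods would instead optimize a smoothness
    objective over the whole cost volume."""
    # Build a (D, H, W) cost volume of absolute differences per candidate disparity.
    cost = np.stack([np.abs(left - np.roll(right, d, axis=1))
                     for d in range(max_disp)])
    # Aggregate the cost over a local window (the "local" in local methods).
    agg = sliding_window_view(cost, (win, win), axis=(1, 2)).sum(axis=(-1, -2))
    # Winner-take-all: cheapest disparity per (interior) pixel.
    return agg.argmin(axis=0)
```

A global method would replace the final `argmin` with an optimization that couples neighboring pixels, trading runtime for coherent depth maps in textureless regions.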
It then introduces the concept of sensor fusion, in which multiple sensors are combined to achieve higher tracking accuracy. It compares different visual SLAM systems, both dense and semi-dense, and shows how including an Inertial Measurement Unit (IMU) aids tracking considerably. This localization framework is then used in a large-scale volumetric mapping system, with results shown for both indoor and outdoor environments on real-world datasets.
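The benefit of fusing an IMU with a drift-free but noisy reference can be illustrated with a classic complementary filter on a single tilt angle. This is a deliberately simplified stand-in for visual-inertial fusion, not the dissertation's estimator; all names and constants are illustrative.

```python
def complementary_filter(gyro_rates, ref_angles, dt=0.01, alpha=0.98):
    """Fuse a high-rate (but biased) gyro with a noisy absolute angle
    reference (e.g. accelerometer tilt, or a visual pose estimate).
    The gyro integral supplies smooth short-term motion; the reference
    slowly corrects the long-term drift."""
    angle = ref_angles[0]
    estimates = []
    for rate, ref in zip(gyro_rates, ref_angles):
        # Mostly trust the integrated gyro, gently pull toward the reference.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * ref
        estimates.append(angle)
    return estimates
```

Gyro-only integration drifts without bound under sensor bias, while the reference alone is noisy; the fused estimate inherits the smoothness of one and the stability of the other, which is the intuition behind adding an IMU to a visual tracker.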
Finally, it explores different error metrics used in direct photometric optimization, the foundation of dense tracking systems. It introduces the Normalized Information Distance (NID), an entropy-based metric shown to achieve a high localization success rate and accuracy even in the face of extreme lighting differences.
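The NID between two images can be computed from their joint intensity histogram as NID(X, Y) = (H(X, Y) − I(X; Y)) / H(X, Y), where H is Shannon entropy and I is mutual information. A minimal sketch (bin count and helper names are illustrative, not from the dissertation):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability distribution, skipping empty bins."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def nid(a, b, bins=32):
    """Normalized Information Distance between two same-sized intensity images.

    NID(X, Y) = (H(X, Y) - I(X; Y)) / H(X, Y), a metric in [0, 1]:
    0 when either image fully predicts the other, 1 when they are independent.
    """
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    joint /= joint.sum()
    h_xy = entropy(joint.ravel())          # joint entropy H(X, Y)
    h_x = entropy(joint.sum(axis=1))       # marginal entropy H(X)
    h_y = entropy(joint.sum(axis=0))       # marginal entropy H(Y)
    mi = h_x + h_y - h_xy                  # mutual information I(X; Y)
    return (h_xy - mi) / h_xy if h_xy > 0 else 0.0
```

Unlike a raw photometric (intensity-difference) error, NID depends only on the statistical relationship between intensities, so a global lighting change that remaps intensities consistently leaves the distance near zero, which is why it remains usable under extreme illumination differences.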
Falquez, Juan, "Towards Robust Dense Visual Simultaneous Localization and Mapping (SLAM)" (2018). Computer Science Graduate Theses & Dissertations. 166.