Doctor of Philosophy (PhD)
This thesis is concerned with real-time monocular visual-inertial simultaneous localization and mapping (VI-SLAM) with application to Long Term Autonomy. Given a sensor rig capable of making visual and inertial measurements, the goal is to estimate its position and orientation (pose) accurately in real time while building a scale-correct map of the surrounding environment. This estimation task requires accurate calibration of both the intrinsic and extrinsic properties of the visual and inertial sensors, so continuous estimation of these calibration parameters is also desired. Three novel methods are presented, covering real-time VI-SLAM, self-calibration, and change detection. Together they form a basis for long-term localization and mapping that is robust to changes in calibration.
The VI-SLAM methodology is motivated by the requirement to produce a scale-correct visual map within an optimization framework able to incorporate relocalization and loop-closure constraints. Special attention is paid to robustness against common real-world difficulties, including degenerate motions and unobservability. Several techniques contribute to this robustness: a relative manifold representation, a minimal-state inverse-depth parameterization, and robust non-metric initialization and tracking.

Also presented is an extensible framework for real-time self-calibration of cameras in the SLAM setting. The system is demonstrated to calibrate both pinhole and fish-eye camera models from unknown initial parameters while seamlessly solving the maximum-likelihood online SLAM problem in real time. Self-calibration is performed by tracking image features and requires no predetermined calibration target. By automatically identifying and using only those portions of the sequence that contain information useful for calibration, the system achieves accurate results incrementally and in constant time with respect to the number of images.

Finally, a framework for online SLAM and self-calibration is presented that can detect and handle significant changes in the calibration parameters. A novel technique estimates the probability that such a change has occurred, after which the system is able to re-calibrate. Maximum-likelihood trajectory and map estimates are computed using an asynchronous and adaptive optimization. The system requires no prior information and can initialize without any special motions or routines, even when observability of the calibration parameters is delayed. Both self-calibration frameworks are extensible and able to cover any calibration parameters that can be estimated from the measurements.

The contributions are individually evaluated in a number of experiments with real data, with specific focus on accuracy and real-time performance.
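The inverse-depth parameterization mentioned above can be illustrated with a minimal sketch. In this common monocular-SLAM representation, a landmark is anchored to the camera pose that first observed it and is described by a single inverse-depth scalar along the observed bearing ray, which keeps the per-landmark state minimal and degrades gracefully for distant points. The function and variable names below are illustrative assumptions, not the thesis's actual implementation:

```python
import numpy as np

def landmark_to_world(anchor_R, anchor_t, bearing, rho):
    """Recover the 3D world point of an inverse-depth landmark.

    anchor_R : (3, 3) rotation of the anchor camera pose (camera-to-world).
    anchor_t : (3,) translation of the anchor camera pose in the world frame.
    bearing  : (3,) unit ray to the landmark, in the anchor camera frame.
    rho      : inverse depth (1 / metres) along the bearing ray.
    """
    # The point lies at depth 1/rho along the bearing in the anchor frame;
    # rotate into the world frame and offset by the anchor position.
    depth = 1.0 / rho
    return anchor_t + anchor_R @ (bearing * depth)

# Example: anchor at the origin with identity orientation, and a landmark
# seen along the optical axis with inverse depth 0.5 (i.e. 2 m away).
R = np.eye(3)
t = np.zeros(3)
b = np.array([0.0, 0.0, 1.0])
p = landmark_to_world(R, t, b, 0.5)
```

In an optimization framework such as the one described, `rho` would be a state variable refined jointly with the poses; parameterizing by inverse depth rather than a full 3D point is what keeps the landmark state minimal.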
Keivan, Nima, "Monocular Visual-Inertial Slam and Self Calibration for Long Term Autonomy" (2017). Computer Science Graduate Theses & Dissertations. 151.