Technion - Israel Institute of Technology
Graduate School
M.Sc. Thesis

M.Sc. Student: Ben-Elisha Yair
Subject: Cooperative Multi-Robot Belief Space Planning for Visual-Inertial Navigation and Online Sensor Calibration
Department: Department of Aerospace Engineering
Supervisor: Assistant Professor Vadim Indelman


Abstract

High-accuracy navigation in GPS-deprived, uncertain environments is of prime importance to various robotics applications. In such scenarios, it has recently been shown that online sensor calibration and multi-robot collaboration, whereby robots make mutual observations of the environment or perform relative observations of each other, can significantly enhance navigation accuracy. However, these approaches typically consider a passive setting, where robot actions are externally determined. In contrast, belief space planning (BSP) approaches account for different sources of uncertainty and can thus identify actions that improve certain aspects of inference, such as accuracy. Yet, existing BSP approaches typically consider neither sensor calibration in the mentioned problem setting nor a visual-inertial SLAM setup.

In this research, we contribute single-robot and multi-robot BSP approaches for active sensor calibration considering a visual-inertial SLAM system. To that end, we maintain a belief over both the robot's pose and sensor calibration, and reason about how that belief would evolve under different actions while accounting for partially unknown and uncertain environments. In particular, we leverage the recently developed concept of IMU pre-integration and develop an appropriate factor graph formulation for future beliefs to facilitate computationally efficient inference within BSP.
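The appeal of IMU pre-integration is that raw inertial measurements between two keyframes are summarized into relative motion deltas that do not depend on the (unknown) state at the first keyframe, so a single factor can be reused without re-integrating when linearization points change. The following is a minimal 1D sketch of that idea only; the simplification to one axis, and the function name, are illustrative and not taken from the thesis:

```python
def preintegrate_1d(accel_samples, dt):
    """Pre-integrate raw 1D accelerometer samples between two keyframes.

    Returns relative deltas (dp, dv) that are independent of the state
    at the first keyframe; in a factor graph they serve as the
    measurement of a single pre-integrated IMU factor.
    """
    dp, dv = 0.0, 0.0
    for a in accel_samples:
        # Position delta accumulates the velocity delta so far, plus
        # the constant-acceleration term over one sample interval.
        dp += dv * dt + 0.5 * a * dt * dt
        # Velocity delta accumulates acceleration.
        dv += a * dt
    return dp, dv

# Constant acceleration of 2 m/s^2 over 1 s (100 samples at 10 ms)
# recovers the closed-form dp = 0.5*a*t^2 = 1.0 m, dv = a*t = 2.0 m/s.
dp, dv = preintegrate_1d([2.0] * 100, 0.01)
```

The full 3D formulation additionally tracks a rotation delta and Jacobians with respect to IMU biases, which is what makes the calibration state in the belief tractable to update.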

Another key aspect of our approach is indirect multi-robot observation updates, given that the states of different robots are correlated. This concept allows a subset of robots to carry on with their individual (possibly time-critical) tasks while preserving high-accuracy estimation by relying on other, expendable robots to make appropriate observations of the environment. We study our approach in a high-fidelity synthetic simulation and show that the determined actions can lead to significantly improved estimation accuracy.
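The indirect-update mechanism can be illustrated with a toy joint Gaussian: when the scalar states of robots A and B are correlated, a direct measurement received by B also shrinks A's uncertainty through the cross-covariance, without A observing anything itself. This is standard Gaussian conditioning (a Kalman update on the joint state); the scalar model and the numbers below are illustrative, not from the thesis:

```python
def indirect_update(Saa, Sab, Sbb, R):
    """Joint covariance [[Saa, Sab], [Sab, Sbb]] over robots A and B.

    Returns the posterior variances of A and B after B alone receives a
    direct measurement z = b + v with noise variance R.
    """
    S = Sbb + R                      # innovation variance
    Sbb_post = Sbb - Sbb * Sbb / S   # B updated by its own measurement
    Saa_post = Saa - Sab * Sab / S   # A updated indirectly via Sab
    return Saa_post, Sbb_post

# Strongly correlated robots: B's measurement cuts A's variance too.
Saa_post, Sbb_post = indirect_update(Saa=1.0, Sab=0.8, Sbb=1.0, R=0.25)
```

With zero cross-covariance (`Sab=0`) A's variance is untouched, which is why maintaining correlations between robots in the joint belief is what makes it worthwhile for expendable robots to gather observations on behalf of task-constrained ones.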