Technion - Israel Institute of Technology
School of Graduate Studies
M.Sc. Thesis

M.Sc. Student: Michael Chojnacki
Subject: Vision-Based Target Trajectory and Ego-Motion Estimation Using Incremental Light Bundle Adjustment
Department: Autonomous Systems and Robotics
Supervisors: Assistant Professor Vadim Indelman
             Full Professor Ehud Rivlin


Abstract

This work presents a vision-based, computationally efficient method for simultaneous robot ego-motion estimation and dynamic target tracking in GPS-denied, unknown or uncertain environments. While numerous vision-based approaches achieve simultaneous ego-motion estimation along with detection and tracking of moving objects (DTMO), many of them require a bundle adjustment (BA) optimization, which involves estimating the 3D points observed in the process. One of the main concerns in robotics applications is the computational effort required to sustain extended operation: since BA is performed incrementally as new camera poses and new measurements arrive, the computational complexity of the problem grows steadily. For applications whose primary interest is highly accurate on-line navigation rather than mapping, the number of variables involved can be considerably reduced by avoiding explicit 3D structure reconstruction, thereby saving processing time. We take advantage of the light bundle adjustment (LBA) method, which allows ego-motion to be calculated without on-line reconstruction of the observed 3D points and thus significantly reduces computation time compared to BA. The proposed method integrates the target tracking problem into the LBA framework, yielding a simultaneous ego-motion estimation and tracking process in which the target is the only 3D point explicitly reconstructed on-line. Furthermore, our method makes use of the recently developed Incremental Smoothing and Mapping (iSAM) technique, which allows calculations to be re-used in order to further reduce the computational cost. Our approach is compared to BA and target tracking in terms of accuracy and computational complexity, using simulated aerial scenarios and real-imagery experiments performed at the Autonomous Navigation and Perception Lab (ANPL) at the Technion.
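
To make the structure-elimination idea more concrete, the following is a minimal sketch of the kind of multi-view constraints on which LBA is built, written here for illustration only: the notation $q_i \doteq R_i^T K^{-1} z_i$, the relative translations $t_{k \rightarrow l}$, and the projection operator $\pi$ are assumptions of this sketch rather than the thesis' exact formulation. Observations of the same static, unknown 3D point in two or three views are related directly through the relative camera geometry, so that point never enters the optimization; only the dynamic target remains an explicit 3D variable, constrained by a standard projection factor:

\begin{align}
g_{2v}\left(x_k, x_l, z_k, z_l\right) &= q_k \cdot \left(t_{k \rightarrow l} \times q_l\right) = 0, \\
g_{3v}\left(x_k, x_l, x_m, z_k, z_l, z_m\right) &= \left(q_l \times q_k\right) \cdot \left(q_m \times t_{l \rightarrow m}\right) - \left(q_k \times t_{k \rightarrow l}\right) \cdot \left(q_m \times q_l\right) = 0, \\
z^{tgt}_i &= \pi\left(x_i, Y\right) + v_i,
\end{align}

where $x_i$ denotes the camera pose at time $i$, $z_i$ an image observation, $Y$ the target's 3D position, and $v_i$ the image measurement noise. Under these assumptions, the two- and three-view constraints take the place of the per-landmark reprojection terms of full BA, while the target factor $\pi\left(x_i, Y\right)$ keeps the target position as the single explicitly reconstructed 3D point in the estimated state.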