Technion - Israel Institute of Technology
Graduate School
M.Sc. Thesis
M.Sc. Student: Kaufman Omri
Subject: Space Navigation with an Omni-Directional Vision Sensor
Department: Department of Autonomous Systems and Robotics
Supervisor: Professor Pinchas Gurfil


Abstract

With the onset of autonomous spacecraft formation flying missions, the ability of satellites to navigate autonomously relative to other space objects has become essential. To implement relative navigation, relative measurements must be taken and fused using relative state estimation. An efficient way to generate such information is by using vision-based measurements. Cameras are passive, low-energy, and information-rich sensors that do not actively interact with other space objects. However, pointing cameras with a conventional field of view at other space objects requires considerable a priori initialization data; in particular, dedicated attitude maneuvers are needed, which may interfere with the satellite's main mission. One way to overcome these difficulties is to use an omnidirectional vision sensor, which has a 360-degree horizontal field of view. In this work, we discuss the development of an omnidirectional vision sensor for satellites, which can be used for relative navigation, formation flying, and space situational awareness. The study includes the development of the measurement equations, dynamical models, and state estimation algorithms, as well as an experimental investigation conducted at the Distributed Space Systems Laboratory.


The estimation is performed using an Extended Kalman Filter, owing to the nonlinear measurement model describing the omnidirectional vision sensor. The relative dynamics are based on the linear Clohessy-Wiltshire equations. The structure of the catadioptric camera is investigated, and three measurement models and calibration methods are examined and compared. The study shows that one of the models is superior in terms of estimation implementation. Computer vision algorithms are examined, and detection methods are presented. The performance of the estimator is tuned mainly through the process noise covariance matrix parameters, with additional tuning of the measurement noise covariance. These matrices are optimized to improve the estimation results.
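For reference, the Clohessy-Wiltshire equations describe the linearized relative motion of a deputy satellite about a chief in a circular orbit with mean motion n, with x radial, y along-track, and z cross-track:

```latex
\begin{align*}
\ddot{x} - 3n^{2}x - 2n\dot{y} &= 0,\\
\ddot{y} + 2n\dot{x} &= 0,\\
\ddot{z} + n^{2}z &= 0.
\end{align*}
```

The abstract does not reproduce the filter equations, so the following is only a rough sketch: one EKF predict/update cycle under the closed-form Clohessy-Wiltshire transition matrix, with an idealized azimuth/elevation (bearing-only) measurement standing in for the catadioptric camera models compared in the thesis. All function names and the simplified measurement model are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def cw_stm(n, dt):
    """Closed-form Clohessy-Wiltshire state transition matrix.
    State: [x, y, z, vx, vy, vz]; x radial, y along-track, z cross-track."""
    s, c = np.sin(n * dt), np.cos(n * dt)
    return np.array([
        [4 - 3*c,       0, 0,     s/n,         2*(1 - c)/n,       0],
        [6*(s - n*dt),  1, 0,     2*(c - 1)/n, (4*s - 3*n*dt)/n,  0],
        [0,             0, c,     0,           0,                 s/n],
        [3*n*s,         0, 0,     c,           2*s,               0],
        [6*n*(c - 1),   0, 0,    -2*s,         4*c - 3,           0],
        [0,             0, -n*s,  0,           0,                 c],
    ])

def ekf_step(xhat, P, z_meas, n, dt, Q, R):
    """One EKF predict/update cycle: CW dynamics, bearing-only measurement.
    A sketch under stated assumptions, not the thesis code."""
    # --- predict with the closed-form CW transition matrix ---
    Phi = cw_stm(n, dt)
    xhat = Phi @ xhat
    P = Phi @ P @ Phi.T + Q
    # --- idealized omnidirectional measurement: azimuth and elevation ---
    x, y, zc = xhat[:3]
    rho2 = x**2 + y**2
    rho = np.sqrt(rho2)
    r2 = rho2 + zc**2
    h = np.array([np.arctan2(y, x), np.arctan2(zc, rho)])
    H = np.zeros((2, 6))
    H[0, 0], H[0, 1] = -y/rho2, x/rho2
    H[1, 0], H[1, 1], H[1, 2] = -x*zc/(rho*r2), -y*zc/(rho*r2), rho/r2
    # --- Kalman update, wrapping the azimuth residual to (-pi, pi] ---
    nu = z_meas - h
    nu[0] = np.arctan2(np.sin(nu[0]), np.cos(nu[0]))
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    xhat = xhat + K @ nu
    P = (np.eye(6) - K @ H) @ P
    return xhat, P
```

A full filter would simply iterate ekf_step over the measurement sequence; in practice the tuning of Q and R, as noted above, dominates the achievable accuracy.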


In addition, a numerical study is provided, in which the low Earth orbit motion of two satellites and the omnidirectional vision sensor measurements are simulated. The non-observable parameters of a mono-vision sensor are discussed. Relative state estimation results based on the mono-vision sensor alone are compared with results based on fusing the sensor with ground station measurements. Under the relative space dynamics motion model, the relative state estimation error obtained using solely the omnidirectional vision sensor is bounded. Nevertheless, sensor fusion improves the estimation.
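One standard way to see the depth non-observability mentioned above, assuming the idealized bearing measurement of the earlier sketch: the Clohessy-Wiltshire equations are linear and homogeneous, so if the relative position history (x, y, z) is a valid trajectory, then so is any scaled copy of it, and both produce identical bearings,

```latex
h(\alpha\boldsymbol{\rho})
  = \begin{pmatrix}
      \operatorname{atan2}(\alpha y,\, \alpha x) \\
      \operatorname{atan2}\!\left(\alpha z,\, \alpha\sqrt{x^{2} + y^{2}}\right)
    \end{pmatrix}
  = h(\boldsymbol{\rho}), \qquad \alpha > 0.
```

The overall scale (range) of the relative orbit is therefore indistinguishable from the bearings alone, which is why fusion with ground station measurements, which carry absolute scale information, improves the estimate.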

In addition, laboratory experiments were conducted, in which measurements were collected using an omnidirectional vision sensor fixed on one robot and tracking another robot in a frictionless, space-like environment created on an air table. The experimental study shows that the non-observable depth effects arising from the mono-vision sensor can be bounded using a good motion model and optimal filter tuning. Finally, the resolution of the laboratory system is analyzed in relation to a real space scenario. We introduce new governing parameters for resolution scaling and present their use in different space scenarios.
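The thesis's governing parameters for resolution scaling are not given in the abstract; as a hypothetical back-of-the-envelope stand-in, one natural scaling quantity is the transverse relative displacement that shifts the target image by one pixel, which grows linearly with range for a fixed angular resolution. The function name and parameter values below are illustrative assumptions only.

```python
import numpy as np

def one_pixel_displacement(range_m, horizontal_pixels, fov_rad=2 * np.pi):
    """Smallest transverse relative displacement (meters) that moves the
    target image by one pixel, for a sensor whose horizontal field of view
    spans fov_rad over horizontal_pixels pixels (small-angle approximation).
    Illustrative only; not the thesis's governing parameters."""
    ifov = fov_rad / horizontal_pixels  # instantaneous FOV per pixel [rad]
    return range_m * ifov

# Scale from an air-table separation (~1 m) to a space scenario (~1 km):
for r in (1.0, 1_000.0):
    print(f"range {r:8.1f} m -> one-pixel displacement "
          f"{one_pixel_displacement(r, horizontal_pixels=2048):.4f} m")
```

Under these assumed numbers, the same one-pixel angular motion corresponds to millimeters on the air table but meters at kilometer-scale ranges, which is the kind of scaling relation such parameters would capture.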