M.Sc Student: Jigalin Anton
Subject: Vision-Based Relative State Estimation of Unknown Dynamic Targets Using Multiple Stereo Rigs
Department: Department of Mechanical Engineering
Supervisor: Professor Pinchas Gurfil
Estimating the pose, motion and structure of cooperative targets using onboard sensors is a challenging problem; it becomes even more challenging when the target is non-cooperative. In the absence of a priori information about the target and of cross-link data sharing, one must seek the best possible approximation of the target's pose, motion and structure. One approach to non-cooperative target state estimation is to use vision sensors, which provide extensive measurement data and acquire information passively.
This work investigates the use of multiple stereo-vision sensors for estimating the relative pose, motion and structure of both cooperative and non-cooperative targets. A computer-vision feature-matching algorithm produces the input data for a recursive filtering algorithm, which provides the relative state estimate.
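To illustrate how a stereo rig turns matched image features into the 3-D measurements that feed such a filter, here is a minimal triangulation sketch for an ideal rectified pinhole stereo pair. The function name and all parameter values are hypothetical, chosen only for illustration; they are not taken from the thesis.

```python
import numpy as np

def triangulate(uL, vL, uR, f, b, cx, cy):
    """Triangulate one matched feature from a rectified stereo pair.

    uL, vL : pixel coordinates of the feature in the left image
    uR     : horizontal pixel coordinate of the match in the right image
    f      : focal length in pixels
    b      : stereo baseline in meters
    cx, cy : principal point in pixels
    Returns the 3-D point in the left-camera frame (meters).
    """
    d = uL - uR                # disparity in pixels (must be positive)
    z = f * b / d              # depth along the optical axis
    x = (uL - cx) * z / f      # back-project horizontal pixel offset
    y = (vL - cy) * z / f      # back-project vertical pixel offset
    return np.array([x, y, z])

# Example: a 40-pixel disparity with a 10 cm baseline and f = 500 px
p = triangulate(420, 260, 380, f=500, b=0.1, cx=320, cy=240)
print(p)  # → [0.25 0.05 1.25]
```

A bank of such triangulated feature points, one set per stereo rig, is the kind of measurement vector a recursive relative-state filter would consume at each time step.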
A newly developed initialization step and the proposed target-structure estimation technique reduce the ambiguity in the target's center-of-mass location. The effectiveness of the proposed filtering scheme is compared against the theoretical limits predicted by the Cramér-Rao lower bound.
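The Cramér-Rao comparison can be illustrated on a toy problem: for N independent Gaussian measurements of a constant parameter, the bound on the estimator variance is sigma²/N, and a Monte-Carlo experiment shows the sample-mean estimator attaining it. This is a generic sketch of the methodology, not the thesis's actual estimation problem.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, N, runs = 0.5, 100, 2000
theta = 1.0                                  # true (constant) parameter

# Cramer-Rao lower bound for N i.i.d. Gaussian measurements: sigma^2 / N
crlb = sigma**2 / N

# Monte-Carlo variance of the sample-mean estimator over many runs
estimates = [(theta + sigma * rng.standard_normal(N)).mean()
             for _ in range(runs)]
mc_var = np.var(estimates)

print(crlb, mc_var)   # the empirical variance should sit near the bound
```

In the thesis setting the state is dynamic and the measurement model nonlinear, so the bound is computed recursively, but the comparison principle (filter error covariance versus the theoretical bound) is the same.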
This work also provides an extensive numerical study of possible system parameters, comparing two filtering techniques and several relative motion models. The performance of the estimation algorithm is evaluated by Monte-Carlo simulations, and the estimation accuracy of the iterated extended Kalman filter is compared to that of the unscented Kalman filter. It is also shown that a simple kinematic model performs well when combined with frequent measurements. The effect of the number and placement of the cameras constituting the stereo rigs is investigated as well. The proposed method was validated using robotic devices in the Distributed Space Systems Laboratory.
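The claim that a simple kinematic model suffices under frequent measurements can be sketched with a one-dimensional constant-velocity Kalman filter evaluated by Monte-Carlo runs: with a short measurement interval, the position RMSE falls well below the raw measurement noise. All model parameters below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def run_cv_kf(dt, steps, q, r, rng):
    """One Monte-Carlo run of a 1-D constant-velocity Kalman filter.

    Returns the position RMSE of the run. State is [position, velocity].
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])        # kinematic (constant-velocity) model
    H = np.array([[1.0, 0.0]])                   # position-only measurement
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # white-acceleration process noise
    R = np.array([[r]])

    x_true = np.array([0.0, 1.0])                # truth: unit initial velocity
    x, P = np.zeros(2), np.eye(2)                # filter starts uninformed
    err = []
    for _ in range(steps):
        # propagate the truth with a small random velocity perturbation
        x_true = F @ x_true + np.array([0.0, np.sqrt(q * dt)]) * rng.standard_normal()
        z = H @ x_true + np.sqrt(r) * rng.standard_normal(1)
        # predict
        x, P = F @ x, F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        err.append(x[0] - x_true[0])
    return np.sqrt(np.mean(np.square(err)))

rng = np.random.default_rng(1)
# frequent measurements (dt = 0.05 s) with 0.2 m measurement noise std
rmse = np.mean([run_cv_kf(dt=0.05, steps=400, q=0.01, r=0.04, rng=rng)
                for _ in range(50)])
print(rmse)   # well below the 0.2 m raw measurement noise
```

The thesis works with full 3-D relative pose, motion and structure and with IEKF/UKF variants, but this miniature shows the evaluation pattern: repeat the filter over many noise realizations and average the error statistics.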