Technion - Israel Institute of Technology, The Graduate School
M.Sc Thesis
M.Sc Student: Greenhut Yaron
Subject: Sensor Fusion of GPS with Omnidirectional Image Registration for Off-Road Autonomous Vehicle Path Tracking
Department: Agricultural Engineering
Supervisor: Professor Emeritus Per-Olof Gutman
Full Thesis Text: English Version


Abstract

One of the main methods in use today for determining a vehicle's location is GPS (Global Positioning System) or DGPS (Differential Global Positioning System). Under optimal conditions, these systems can determine the location of a vehicle to an accuracy of better than 1 meter (DGPS).

The accuracy obtained by (D)GPS is in many cases not sufficient for navigating an off-road trajectory. Therefore, most (D)GPS-based systems are complemented with methods that compensate for the regular GPS errors, such as odometer readings or an Inertial Navigation Unit (INU) working in conjunction with the GPS. The INU itself has high accuracy, at best on the order of centimeters. However, the INU is accurate only for short periods of time, and relies on external references for calibration, e.g. the GPS or, if possible, planned stops of the vehicle at which the true velocity is known to be zero. There are situations in which the GPS accuracy seriously deteriorates for significant periods of time (where a few minutes are already a long time) while the INU cannot compensate for the GPS errors. Such situations demand sensors and methods that are not influenced by the GPS inaccuracy and may be applied in conjunction with the GPS/INU readings.
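To illustrate how such complementary readings are commonly combined, the sketch below shows a minimal one-dimensional Kalman filter that dead-reckons with the inertial acceleration between GPS fixes and applies a correction whenever a fix is available. It is only a simplified illustration of the fusion principle, not the filter developed in the thesis; the function name, state model and noise parameters are assumptions made for the example.

```python
import numpy as np

def fuse_gps_inu(gps_pos, inu_acc, dt=0.1, gps_var=1.0, acc_var=0.05):
    """Minimal 1-D Kalman filter sketch: integrate the INU acceleration
    between GPS fixes and correct position/velocity whenever a GPS
    reading arrives (None marks a GPS outage)."""
    x = np.zeros(2)                           # state: [position, velocity]
    P = np.eye(2)                             # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity transition
    B = np.array([0.5 * dt ** 2, dt])         # acceleration input vector
    Q = acc_var * np.outer(B, B)              # process noise from the accelerometer
    H = np.array([[1.0, 0.0]])                # GPS measures position only
    R = np.array([[gps_var]])                 # GPS measurement noise
    estimates = []
    for z, a in zip(gps_pos, inu_acc):
        # prediction step, driven by the inertial measurement
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        if z is not None:                     # GPS available: measurement update
            y = z - H @ x                     # innovation
            S = H @ P @ H.T + R               # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates
```

During a GPS outage the filter simply keeps integrating the inertial input, which is exactly why its error grows with time and why an additional, GPS-independent position reference is valuable.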

The additional sensory system suggested here is based on a vision sensor together with path detection algorithms that recognize the characteristics of an off-road path: the road boundaries, the road width near the vehicle, and the road-vehicle relative orientation (position and azimuth angle). The uniqueness of the solution lies in its vision sensor: an omni-directional lens mounted on a high-resolution digital camera. The wide field of view grants a better understanding of the scene context and a larger amount of data to rely on when estimating the road characteristics. The omni-directional lens offers not only the view in front of the vehicle, but also the side and rear views, which in turn results in a better estimation of the vehicle's position and orientation on the path.
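A common first processing step for such a sensor is to unwrap the circular omnidirectional image into a panoramic strip, so that the front, side and rear sectors can all be examined by conventional road-boundary detectors. The snippet below is a minimal sketch of this unwrapping, assuming the image centre and the inner and outer image radii are known from calibration; the function and its parameters are illustrative and not taken from the thesis.

```python
import numpy as np
import cv2

def unwrap_omni(img, center, r_inner, r_outer, out_w=1440, out_h=360):
    """Sketch: unwrap a circular omnidirectional image into a panoramic
    strip by sampling along radii (azimuth -> column, radius -> row)."""
    cx, cy = center
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)  # one column per azimuth
    radius = np.linspace(r_inner, r_outer, out_h)                 # one row per radius
    theta_grid, radius_grid = np.meshgrid(theta, radius)
    map_x = (cx + radius_grid * np.cos(theta_grid)).astype(np.float32)
    map_y = (cy + radius_grid * np.sin(theta_grid)).astype(np.float32)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)
```

Once unwrapped, standard techniques such as edge detection and line fitting can be applied to each sector of the panorama to estimate the road boundaries and the vehicle's offset and heading relative to them.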

In this thesis we use image registration techniques against aerial photos or a map of the area, and integrate the vision sensor with the GPS outputs using sensor fusion techniques. In this way, the total position error may be substantially decreased and, in effect, the GPS is calibrated in real time.
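One simple way to realize such a registration step, sketched below under strong simplifying assumptions (the aerial-map patch and the vehicle's local ground view are already brought to the same scale and orientation), is phase correlation: the estimated planar shift between the two images, scaled to metres, can be fed to the fusion filter as an additional position measurement that is independent of the GPS error. The function and parameter names are hypothetical and do not describe the exact method of the thesis.

```python
import numpy as np
import cv2

def vision_correction(aerial_patch, local_view, metres_per_pixel):
    """Sketch: estimate the planar shift between an aerial-map patch centred
    on the GPS fix and a ground-plane view from the onboard camera, and
    convert it to a position correction in metres.
    Both inputs are assumed to be colour images of identical size."""
    a = np.float32(cv2.cvtColor(aerial_patch, cv2.COLOR_BGR2GRAY))
    b = np.float32(cv2.cvtColor(local_view, cv2.COLOR_BGR2GRAY))
    # phase correlation returns the (dx, dy) shift and a confidence response
    (dx, dy), response = cv2.phaseCorrelate(a, b)
    correction = np.array([dx, dy]) * metres_per_pixel
    return correction, response
```

A low response value would indicate an unreliable match (e.g. featureless terrain), in which case the correction could simply be discarded and the GPS/INU estimate used as is.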