Abstract

Accurate self-localization is a critical problem in autonomous driving systems. An autonomous vehicle requires sub-meter positioning accuracy for motion planning. In urban scenarios, however, common Global Navigation Satellite System (GNSS) localization suffers from difficulties such as multipath and Non-Line-Of-Sight (NLOS) reception. Stereo visual odometry can localize the vehicle relatively by tracking its ego-motion from stereo image pairs, but its error accumulates over time. A 3D map is an effective tool for reducing this cumulative positioning error. In this paper, we propose to realize scene understanding from a stereo camera and further utilize a city model map, including 3D building and 2D road information, to improve the visual odometry. In our proposal, the stereo camera is used both to generate visual odometry and to reconstruct the building scene. The accumulated building scenes form a local building map. We integrate the local building map with a Normal Distribution Transform (NDT) map generated from the 3D building map in a particle filter. The lane detection result helps to rectify the in-lane positioning error and keep the vehicle in its lane with the aid of the 2D road map. We conducted a series of experiments in the Hitotsubashi area of Tokyo, which contains many tall buildings. The experimental results indicate that the accumulated error of visual odometry can be corrected by the proposed method and sub-meter localization accuracy is achieved.
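The fusion described above, stereo visual odometry as the motion model and NDT building-map matching as the measurement model inside a particle filter, could be sketched as follows. This is a minimal illustration only: the state layout, noise levels, single-Gaussian NDT cell, and all function names are assumptions, not the paper's actual formulation, and lane-based correction is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, vo_delta, noise_std=(0.1, 0.1, 0.01)):
    """Propagate each particle (x, y, yaw) by the stereo-VO increment plus noise."""
    dx, dy, dyaw = vo_delta
    n = len(particles)
    cos_t, sin_t = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += cos_t * dx - sin_t * dy + rng.normal(0, noise_std[0], n)
    particles[:, 1] += sin_t * dx + cos_t * dy + rng.normal(0, noise_std[1], n)
    particles[:, 2] += dyaw + rng.normal(0, noise_std[2], n)
    return particles

def ndt_likelihood(particles, building_points, ndt_mean, ndt_cov_inv):
    """Weight particles by how well the locally reconstructed building points,
    transformed into the map frame, fit one NDT cell (a single 2D Gaussian)."""
    weights = np.empty(len(particles))
    for i, (x, y, yaw) in enumerate(particles):
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s], [s, c]])
        pts = building_points @ R.T + np.array([x, y])  # local -> map frame
        d = pts - ndt_mean
        # Sum of Gaussian scores of all transformed points in this cell.
        weights[i] = np.exp(-0.5 * np.einsum('ij,jk,ik->i', d, ndt_cov_inv, d)).sum()
    return weights

def resample(particles, weights):
    """Resample particles proportionally to their NDT matching score."""
    w = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy usage: a reconstructed building wall (local frame) and one NDT cell at x = 6.
particles = rng.normal([0.0, 0.0, 0.0], [0.5, 0.5, 0.05], size=(200, 3))
wall = np.column_stack([np.full(20, 5.0), np.linspace(-2, 2, 20)])
particles = predict(particles, vo_delta=(1.0, 0.0, 0.0))
weights = ndt_likelihood(particles, wall,
                         ndt_mean=np.array([6.0, 0.0]),
                         ndt_cov_inv=np.linalg.inv(np.diag([0.2, 4.0])))
particles = resample(particles, weights)
```

In a full pipeline the NDT map would contain many cells covering the 3D building map, each point would be scored against its nearest cell, and the lane-detection result would additionally constrain the lateral position against the 2D road map.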
