Abstract

We propose the fusion of stereo visual odometry and range measurements for Simultaneous Localization And Mapping (SLAM). The algorithm runs the two core processes of feature- and keyframe-based stereo visual odometry (tracking and local mapping), while saving all local keyframes, map points, and visual constraints in a global map database, together with the available ranging constraints of the keyframes. At the end of the sequence, the state estimates from visual odometry are fused with the ranging measurements to mitigate the accumulated error inherent in the odometry process and achieve global consistency. We formulate a simple graphical representation for the fusion and perform least squares estimation with the sparse Levenberg-Marquardt algorithm, minimizing the sum of the squared re-projection and distance errors over all the constraints defined in the global graph. The proposed algorithm is evaluated both qualitatively and quantitatively on a real stereo image dataset with synthetically generated distance measurements corrupted by additive Gaussian white noise. The experimental results show that the proposed SLAM algorithm effectively compensates the cumulative bias of visual odometry. Furthermore, the global accuracy of the trajectory estimate is comparable to that of stereo vision-only SLAM with loop closing.
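The fusion idea can be sketched as a joint nonlinear least squares problem: odometry (relative-motion) residuals and range residuals to a known anchor are stacked into one residual vector and minimized with a Levenberg-Marquardt solver. The 2D toy setup below, the variable names, and the single-anchor ranging model are illustrative assumptions, not the paper's actual formulation (which uses re-projection errors over a keyframe graph).

```python
# Hypothetical minimal sketch: fuse drifting odometry with range
# measurements by minimizing the sum of squared residuals over both
# constraint types, as in graph-based SLAM back ends.
import numpy as np
from scipy.optimize import least_squares

anchor = np.array([0.0, 10.0])            # assumed known ranging beacon
true_poses = np.cumsum([[1.0, 0.0]] * 5, axis=0)   # ground-truth path
odom = np.array([[1.0, 0.1]] * 5)         # biased relative-motion estimates
ranges = np.linalg.norm(true_poses - anchor, axis=1)  # synthetic distances

def residuals(x):
    poses = x.reshape(-1, 2)
    prev = np.vstack([[0.0, 0.0], poses[:-1]])
    r_odom = (poses - prev - odom).ravel()                      # visual odometry term
    r_range = np.linalg.norm(poses - anchor, axis=1) - ranges   # ranging term
    return np.concatenate([r_odom, r_range])

x0 = np.cumsum(odom, axis=0).ravel()      # initialize from dead reckoning
sol = least_squares(residuals, x0, method="lm")   # Levenberg-Marquardt
fused = sol.x.reshape(-1, 2)
```

Because the range constraints are anchored to a fixed point, the optimized trajectory drifts less than pure dead reckoning, mirroring how the paper's global optimization compensates the cumulative bias of visual odometry.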
