Abstract

Robust navigation for mobile robots over long distances requires an accurate method for tracking the robot position in the environment. Promising techniques for position estimation by determining the camera ego-motion from monocular or stereo sequences have been previously described. However, long-distance navigation requires both a high level of robustness and a low rate of error growth. In this paper, we describe a methodology for long-distance rover navigation that meets these goals using robust estimation of ego-motion. The basic method is a maximum-likelihood ego-motion algorithm that models the error in stereo matching as a normal distribution elongated along the (parallel) camera viewing axes. Several mechanisms are described for improving navigation robustness in the context of this methodology. In addition, we show that a system based on only camera ego-motion estimates will accumulate errors with super-linear growth in the distance traveled, owing to increasing orientation errors. When an absolute orientation sensor is incorporated, the error growth can be reduced to a linear function of the distance traveled. We have tested these techniques using both extensive simulation and hundreds of real rover images and have achieved a low, linear rate of error growth. This method has been implemented to run on-board a prototype Mars rover.
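
To make the maximum-likelihood formulation concrete, the sketch below illustrates one plausible reading of it: each triangulated landmark carries an anisotropic Gaussian covariance elongated along its viewing ray, and the motion (R, t) is found by minimizing the sum of Mahalanobis distances between matched points. All names and parameters here (point_covariance, sigma_lateral, sigma_range, the toy data) are illustrative assumptions, not the paper's actual implementation or values.

```python
# Hedged sketch of maximum-likelihood stereo ego-motion under a Gaussian
# error model elongated along the camera viewing axis. This is an
# illustration of the general technique, not the paper's code.
import numpy as np
from scipy.linalg import solve_triangular
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def point_covariance(p, sigma_lateral=0.01, sigma_range=0.05):
    """Assumed covariance for a triangulated point: large variance along
    the viewing ray (range direction), small variance perpendicular to it."""
    ray = p / np.linalg.norm(p)  # viewing direction from the camera
    # Build an orthonormal basis whose last axis is the viewing ray.
    tmp = np.array([1.0, 0.0, 0.0]) if abs(ray[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(ray, tmp); u /= np.linalg.norm(u)
    v = np.cross(ray, u)
    basis = np.column_stack([u, v, ray])
    return basis @ np.diag([sigma_lateral**2,
                            sigma_lateral**2,
                            sigma_range**2]) @ basis.T

def residuals(x, P, Q):
    """Whitened residuals, so that least_squares minimizes the sum of
    Mahalanobis distances -- the maximum-likelihood objective."""
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    out = []
    for p, q in zip(P, Q):
        # Combined covariance of the match error, propagated through R.
        S = point_covariance(q) + R @ point_covariance(p) @ R.T
        L = np.linalg.cholesky(S)
        e = q - (R @ p + t)
        out.append(solve_triangular(L, e, lower=True))  # ||L^-1 e||^2 = e^T S^-1 e
    return np.concatenate(out)

# Toy usage: recover a known motion from noisy triangulated landmarks.
rng = np.random.default_rng(0)
P = rng.uniform([-2, -2, 4], [2, 2, 12], size=(50, 3))  # points in frame A
R_true = Rotation.from_rotvec([0.0, 0.05, 0.0]).as_matrix()
t_true = np.array([0.2, 0.0, 0.1])
Q = (R_true @ P.T).T + t_true + rng.normal(0, 0.01, P.shape)
fit = least_squares(residuals, np.zeros(6), args=(P, Q))
print("rotation vector:", fit.x[:3], "translation:", fit.x[3:])
```

The whitening step is what makes this a maximum-likelihood rather than ordinary least-squares fit: range errors, which are large for distant stereo points, are down-weighted relative to the better-constrained lateral errors, matching the elongated error model described in the abstract.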
