Abstract

Current rover localization techniques such as visual odometry have proven to be very effective on short‐ to medium‐length traverses (e.g., up to a few kilometers). This paper addresses the problem of long‐range rover localization (e.g., 10 km and up) by developing an algorithm named MOGA (Multi‐frame Odometry‐compensated Global Alignment). The algorithm is designed to globally localize a rover by matching features detected in a three‐dimensional (3D) orbital elevation map to features extracted from rover‐based 3D LIDAR scans. The accuracy and efficiency of MOGA are enhanced with visual odometry and inclinometer/sun‐sensor orientation measurements. The methodology was tested with real data, including 37 LIDAR scans of terrain from a Mars–Moon analog site on Devon Island, Nunavut. When a scan contained a sufficient number of good topographic features, localization produced position errors of no more than 100 m, most of which were less than 50 m and some as low as a few meters. Results were compared to and shown to outperform VIPER, a competing global localization algorithm that was given the same initial conditions as MOGA. On a 10‐km traverse, MOGA's localization estimates were shown to significantly outperform visual odometry estimates. This paper shows how the developed algorithm can be used to accurately and autonomously localize a rover over long‐range traverses. © 2010 Wiley Periodicals, Inc.
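To make the global-alignment idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual MOGA implementation). It assumes that absolute orientation is already known from the inclinometer/sun-sensor measurements, so aligning rover-frame features to orbital-map features reduces to finding a 2-D translation; a simple consensus vote over candidate translations, one per map-feature/scan-feature pairing, stands in for the paper's matching strategy:

```python
import numpy as np

def global_align_translation(map_feats, scan_feats, tol=1.0):
    """Hypothetical sketch: with orientation fixed by sun-sensor/inclinometer
    measurements, global alignment reduces to estimating a 2-D translation.
    Each map-feature/scan-feature pairing proposes a candidate translation;
    the candidate under which the most scan features land near some map
    feature (within `tol`, in map units) wins. This consensus scheme is an
    illustrative stand-in, not MOGA's actual matching algorithm."""
    best_t, best_support = None, -1
    for m in map_feats:
        for s in scan_feats:
            t = m - s  # candidate translation implied by this pairing
            shifted = scan_feats + t
            # distance from every shifted scan feature to every map feature
            d = np.linalg.norm(shifted[:, None, :] - map_feats[None, :, :],
                               axis=2)
            support = int((d.min(axis=1) < tol).sum())
            if support > best_support:
                best_support, best_t = support, t
    return best_t, best_support

# Usage: recover a known offset between simulated map and scan features.
rng = np.random.default_rng(0)
map_feats = rng.uniform(0.0, 100.0, size=(15, 2))   # e.g., terrain peaks
true_t = np.array([40.0, -25.0])
scan_feats = map_feats[:8] - true_t + rng.normal(0.0, 0.1, size=(8, 2))
t_est, support = global_align_translation(map_feats, scan_feats)
```

The vote-based search is quadratic in the number of features, which is acceptable only because topographic features (peaks, ridges) are sparse; odometry between scans would further constrain the candidate set.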
