Autonomous localization is paramount for planetary rover surface exploration missions. However, existing methods exhibit limited localization accuracy, largely because of the unstructured texture characteristics of planetary surfaces. In response, this study presents a novel Neural Radiance Field (NeRF) driven visual odometry correction method that enables high-precision 6-DoF rover pose estimation and local map pruning. First, an image saliency evaluation approach combining binarization and feature detection is introduced to select landmarks that are conducive to rover re-localization. Subsequently, the chosen landmarks are reconstructed and rendered in 3D from a priori planetary surface images and their NeRF models. High-precision odometry correction is achieved by optimizing the photometric loss between NeRF-rendered images and real images. Simultaneously, the odometry correction mechanism is autonomously employed to refine the NeRF model of the corresponding landmark, yielding an improved local map and progressively higher rover localization accuracy. Numerical simulations and experimental trials are carried out to evaluate the proposed method, and the results demonstrate state-of-the-art rover re-localization accuracy and local map pruning.
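The pose-correction step described in the abstract can be illustrated with a minimal, iNeRF-style sketch (not the authors' implementation): a coarse odometry pose is refined by gradient descent on the photometric error between the landmark's NeRF rendering and the observed image. The renderer callback `render_fn`, the axis-angle pose parameterization, and all function names below are illustrative assumptions.

```python
# Minimal sketch of photometric-loss pose refinement against a NeRF rendering.
# Assumptions (not from the paper): axis-angle + translation parameterization,
# Adam optimizer, and a user-supplied differentiable renderer `render_fn(pose)`
# that returns an image from the landmark's trained NeRF model.
import torch


def se3_to_matrix(rotvec: torch.Tensor, trans: torch.Tensor) -> torch.Tensor:
    """Exponential map: axis-angle (3,) and translation (3,) -> 4x4 pose."""
    theta = torch.linalg.norm(rotvec) + 1e-8
    k = rotvec / theta
    # Skew-symmetric matrix of the rotation axis (Rodrigues' formula).
    K = torch.zeros(3, 3, dtype=rotvec.dtype)
    K[0, 1], K[0, 2] = -k[2], k[1]
    K[1, 0], K[1, 2] = k[2], -k[0]
    K[2, 0], K[2, 1] = -k[1], k[0]
    R = torch.eye(3, dtype=rotvec.dtype) + torch.sin(theta) * K \
        + (1.0 - torch.cos(theta)) * (K @ K)
    T = torch.eye(4, dtype=rotvec.dtype)
    T[:3, :3] = R
    T[:3, 3] = trans
    return T


def refine_pose(render_fn, image, rotvec0, trans0, iters=200, lr=1e-2):
    """Refine a coarse 6-DoF odometry pose by minimizing the photometric (MSE)
    loss between the NeRF rendering render_fn(pose) and the observed image."""
    rotvec = rotvec0.clone().requires_grad_(True)
    trans = trans0.clone().requires_grad_(True)
    opt = torch.optim.Adam([rotvec, trans], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        rendered = render_fn(se3_to_matrix(rotvec, trans))
        loss = torch.mean((rendered - image) ** 2)  # photometric loss
        loss.backward()
        opt.step()
    return se3_to_matrix(rotvec, trans).detach()
```

In this sketch the same photometric residual that corrects the pose could, in a second pass, be back-propagated into the NeRF weights to refine the landmark model, mirroring the mutual odometry-correction/map-refinement loop the abstract describes; how the authors actually couple the two is detailed in the full paper, not here.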