Abstract

In recent decades, terrain modelling and reconstruction techniques have attracted increasing research interest for precise short- and long-distance autonomous navigation, localisation and mapping within field robotics. One of the most challenging applications is autonomous planetary exploration using mobile robots. Rovers deployed to explore extraterrestrial surfaces are required to perceive and model the environment with little or no intervention from the ground station. To date, stereopsis represents the state-of-the-art method and can achieve short-distance planetary surface modelling. However, future space missions will require scene reconstruction at greater range, fidelity and feature complexity, potentially using other sensors such as Light Detection And Ranging (LIDAR). LIDAR has been extensively exploited for target detection, identification and depth estimation in terrestrial robotics, but is still under development as a viable technology for space robotics. This paper first reviews current methods for scene reconstruction and terrain modelling using cameras in planetary robotics and LIDARs in terrestrial robotics; we then propose camera-LIDAR fusion as a feasible technique to overcome the limitations of either sensor alone for planetary exploration. A comprehensive analysis is presented to demonstrate the advantages of camera-LIDAR fusion in terms of range, fidelity, accuracy and computation.
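The core of camera-LIDAR fusion described in the abstract is associating each LIDAR range return with the camera pixel it projects to, via the extrinsic calibration between the two sensors. The sketch below is a minimal, generic illustration of that projection step (pinhole camera model, known intrinsics K and extrinsics R, t); the function name and matrices are assumptions for illustration, not the paper's pipeline.

```python
import numpy as np

def colorize_lidar_points(points_lidar, image, K, R, t):
    """Project LIDAR points into a camera image and sample colours.

    points_lidar : (N, 3) points in the LIDAR frame
    image        : (H, W, 3) camera image
    K            : (3, 3) camera intrinsic matrix
    R, t         : extrinsic rotation (3, 3) and translation (3,)
                   mapping LIDAR coordinates into the camera frame
    Returns the points visible in the image and their sampled colours.
    """
    # Transform into the camera frame; keep points in front of the camera.
    pts_cam = points_lidar @ R.T + t
    in_front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[in_front]

    # Pinhole projection onto the image plane (perspective divide).
    uvw = pts_cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)

    # Discard points that fall outside the image bounds.
    h, w = image.shape[:2]
    visible = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    return points_lidar[in_front][visible], image[v[visible], u[visible]]
```

The returned coloured point cloud combines LIDAR's metric range with the camera's texture detail, which is the complementarity the fusion argument rests on.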

Highlights

  • Robotic platforms have experienced tremendous growth in their usage over the past five decades of planetary exploration missions, spanning a diversity of technologies such as orbiting spacecraft [1], space telescopes [2] and stationary landers [3,4].

  • Over the past few decades, planetary rovers have evolved into highly complex intelligent systems utilising a variety of onboard sensors that complement their autonomous capabilities.

  • The rest of the paper is structured as follows: Section 2 provides a literature survey on the current state of the art in terrain perception and scene reconstruction used on board planetary rovers, or potentially useful for future missions, using cameras and Light Detection And Ranging (LIDAR) technologies.


Summary

Introduction

Robotic platforms have experienced tremendous growth in their usage over the past five decades of planetary exploration missions, spanning a diversity of technologies such as orbiting spacecraft [1], space telescopes [2] and stationary landers [3,4]. Past and current rover missions (e.g., Mars Exploration Rover (MER), Mars Science Laboratory (MSL) and Exobiology on Mars (ExoMars)) use cameras for terrain perception, such as stereopsis for autonomous rover navigation (e.g., visual odometry), hazard detection (e.g., slip perception) and scientific study (e.g., MSL ChemCam) [5]. None of these missions consider loop closure; autonomous navigation to date is a one-way traverse [5]. The rest of the paper is structured as follows: Section 2 provides a literature survey on the current state of the art in terrain perception and scene reconstruction used on board planetary rovers, or potentially useful for future missions, using cameras and LIDAR technologies.

Terrain Perception Onboard Planetary Rovers
Cameras
Camera-LIDAR Fusion for Planetary Terrain Perception
Hardware Setup
Sensor Data
Sensor Calibration
Sensor Fusion for Surface Reconstruction
Point Cloud Triangulation and Surface Mesh Generation
Proposed Pipelined Mesh Simplification Algorithm
Performance Analysis
Discussion
Conclusions

