Abstract

Visual odometry provides astronauts with accurate knowledge of their position and orientation. Wearable astronaut navigation systems should be simple and compact. Therefore, monocular vision methods are preferred over stereo vision systems, commonly used in mobile robots. However, the projective nature of monocular visual odometry causes a scale ambiguity problem. In this paper, we focus on the integration of a monocular camera with a laser distance meter to solve this problem. The most remarkable advantage of the system is its ability to recover a global trajectory for monocular image sequences by incorporating direct distance measurements. First, we propose a robust and easy-to-use extrinsic calibration method between camera and laser distance meter. Second, we present a navigation scheme that fuses distance measurements with monocular sequences to correct the scale drift. In particular, we explain in detail how to match the projection of the invisible laser pointer on other frames. Our proposed integration architecture is examined using a live dataset collected in a simulated lunar surface environment. The experimental results demonstrate the feasibility and effectiveness of the proposed method.

Highlights

  • The astronaut navigation system is one of the most important systems for manned missions on the lunar surface, as it keeps astronauts safe while exploring previously unknown environments and provides accurate positions for scientific targets

  • Most visual odometry (VO) research has been performed with a stereo vision scheme, which is not an optimal configuration for an ideal wearable astronaut navigation system, because it is bulkier and consumes more power than monocular vision

  • The hardware of the astronaut navigation system consists of five components: an industrial camera (MV-VE141SC/SM, Microvision, Xi’an, China; image dimension: 1392 pixels × 1040 pixels, focal length: 12 mm, max. frequency: 10 Hz); a laser distance meter (LDM) (CLD-A with RS232 port, Chenglide, Beijing, China; accuracy: ±2 mm, max. frequency: 4 Hz); a specially designed platform that holds these two devices rigidly and supplies power from the on-suit batteries; an industrial computer (CPU: Intel Core i5) that controls the acquisition of images and laser readings; and an iPad that triggers the camera and the distance meter via the computer over a local Wi-Fi network


Introduction

The astronaut navigation system is one of the most important systems for manned missions on the lunar surface, as it keeps astronauts safe while exploring previously unknown environments and provides accurate positions for scientific targets. Ordonez et al. presented in detail an extrinsic calibration method for a digital camera and an LDM; later, this low-cost 2D measurement system was extended to reconstruct scaled 3D models of buildings [20]. To obtain more accurate, metrically scaled navigation results, we present a flexible and robust extrinsic calibration method between the camera and the LDM. The whole calibration process requires almost no manual intervention and is robust to gross errors. We describe in detail how to match the invisible laser spot on other frames and how to correct the scale drift using the distance measurements and the calibration parameters. In principle, this enhanced monocular VO method is equally applicable to a mobile robot (e.g., a rover).
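The core idea of correcting monocular scale drift with an LDM can be sketched as follows: the extrinsic calibration gives the laser beam's origin and direction in the camera frame, so a range reading places the laser spot at a known metric 3D position; comparing that position with the up-to-scale monocular reconstruction of the same spot yields the global scale factor. The sketch below is a minimal illustration of this geometry, not the paper's implementation; the extrinsic values and the triangulated point are assumed for the example.

```python
import numpy as np

# Hypothetical extrinsics of the LDM in the camera frame (illustrative values,
# in practice obtained from the calibration step described in the paper):
laser_origin = np.array([0.05, 0.0, 0.0])  # LDM emitter position (metres)
laser_dir = np.array([0.0, 0.0, 1.0])      # unit beam direction

def laser_point_metric(d):
    """Metric 3D position of the laser spot in the camera frame,
    given an LDM range reading d (metres)."""
    return laser_origin + d * laser_dir

def global_scale(d, p_vo):
    """Scale factor mapping the up-to-scale monocular reconstruction
    of the laser spot (p_vo) onto its metric position."""
    p_metric = laser_point_metric(d)
    return np.linalg.norm(p_metric) / np.linalg.norm(p_vo)

# Example: monocular VO triangulated the matched laser spot at an
# arbitrary-unit position, while the LDM measured a range of 2.0 m.
p_vo = np.array([0.02, 0.0, 0.4])          # up-to-scale coordinates (assumed)
s = global_scale(2.0, p_vo)
t_metric = s * np.array([0.0, 0.0, 0.1])   # rescale the VO translation
```

In a full pipeline this scale would be estimated robustly over many frames (as in the "Robust Scale Estimation" section below) rather than from a single reading.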

Proposed Approach for Monocular Visual Odometry
Robust Calibration of the Navigation Platform
Detection of the Laser Pointer Projection Center
Extrinsic Calibration of the Laser Distance Meter and the Camera
Laser Pointer Projection Based on the Index Table
Geometrical Calibration of LDM and Camera
A Global Scaled Monocular VO Algorithm for Astronaut Navigation
Robust Relative Motion Estimation
Laser Pointer Matching
Robust Scale Estimation
Computation of Relative Scale
Correction of Scale Drift with Global Scale Constraints
Experiments and Results
Summary and Conclusions