Abstract

Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents mostly depends on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder with which the motion of the MAV is estimated. This realization is expected to be more flexible with respect to the environment than laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV’s navigation system. First, however, the pose between the two sensors must be known, and an improved calibration method is proposed to obtain it. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.

Highlights

  • At present, Micro Air Vehicles (MAVs) are becoming very popular in various applications. They are affordable and, due to their small size and weight, flexible in terms of location

  • In order to correctly integrate relative measurements, the navigation system is augmented with the Stochastic Cloning Filter (SCF) approach [23]

  • All aspects are studied, from calibration, to ego-motion estimation, to covariance estimation, to the final integrated navigation system
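The Stochastic Cloning Filter mentioned above fuses *relative* measurements (such as estimated ego-motion between two epochs) by augmenting the state with a copy of itself at the start of the motion segment, so that the measurement can refer back to that epoch with the correct cross-correlations. The following is a minimal 1-D sketch of that idea, not the paper's implementation; the state layout, noise values, and function names are illustrative assumptions:

```python
import numpy as np

def clone_state(x, P):
    """Stochastic cloning: augment the state with a copy of itself so a
    later relative measurement can refer back to this epoch.  The
    augmented covariance keeps full cross-correlation between the live
    state and its clone."""
    x_aug = np.concatenate([x, x])
    P_aug = np.block([[P, P], [P, P]])
    return x_aug, P_aug

def relative_update(x_aug, P_aug, z_rel, R_meas):
    """Kalman update with a relative measurement z_rel ~ x_now - x_clone."""
    n = len(x_aug) // 2
    H = np.hstack([np.eye(n), -np.eye(n)])   # measures current minus clone
    S = H @ P_aug @ H.T + R_meas
    K = P_aug @ H.T @ np.linalg.inv(S)
    x_new = x_aug + K @ (z_rel - H @ x_aug)
    P_new = (np.eye(2 * n) - K @ H) @ P_aug
    return x_new, P_new

# Toy example: clone at the start of a motion segment, propagate the
# live state, then fuse the measured relative displacement.
x, P = np.array([0.0]), np.array([[1.0]])
x_aug, P_aug = clone_state(x, P)
x_aug[0] += 1.0      # predicted displacement since the clone
P_aug[0, 0] += 0.5   # process noise accumulated on the live state
x_new, P_new = relative_update(x_aug, P_aug,
                               np.array([1.2]), np.array([[0.5]]))
```

Note that immediately after cloning, the innovation covariance of a relative measurement reduces to the measurement noise alone, so a relative measurement over a zero-length interval correctly carries no information.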


Summary

Introduction

Micro Air Vehicles (MAVs) are becoming very popular in various applications. Due to the geometry of the calibration object, 3D feature points of the laser rangefinder are recovered in image coordinates. The pose between both sensors is then obtained by solving the Perspective-3-Point (P3P) problem [10]. The authors of [11] propose an approach that estimates the depth information of image features which do not necessarily lie on the projected laser line; this is possible by assuming ground-vehicle motion and a structured environment, such as indoors. The objective is to obtain an accurate navigation solution and to become more flexible in terms of the MAV’s surroundings compared to the existing laser-scan-matching approach [4]. This includes many aspects, from the deep integration of two complementary sensors, to ego-motion estimation, to the final integrated navigation system, and results in a more accurate and robust navigation solution.
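The 3D-to-2D correspondences underlying both the calibration and the P3P solve pair a 3D point measured by the laser rangefinder with its 2D projection in the camera image, via the extrinsic pose between the two sensors. A minimal sketch of that forward projection is shown below; the extrinsics R, t and the intrinsic matrix K are illustrative toy values, not the calibrated parameters from the article:

```python
import numpy as np

# Hypothetical extrinsic calibration: rotation R and translation t
# mapping laser-frame points into the camera frame (identity rotation,
# 10 cm offset along the camera x-axis in this toy example).
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])

# Pinhole intrinsics (illustrative values only).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project_laser_point(p_laser):
    """Transform a 3D laser-frame point into the camera frame and
    project it with the pinhole model, yielding pixel coordinates."""
    p_cam = R @ p_laser + t          # laser frame -> camera frame
    uvw = K @ p_cam                  # homogeneous image coordinates
    return uvw[:2] / uvw[2]          # perspective division

# A laser return 2 m in front of the sensor.
u, v = project_laser_point(np.array([0.0, 0.0, 2.0]))
```

Given at least three (in practice four, for disambiguation) such pairs, the inverse problem, recovering R and t from the correspondences, is exactly the P3P problem referenced above.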

System Overview
Coordinate Systems and Transformations
Calibration
Calibration Object
Geometric Calculation
Recovering the Feature Point in the Image
Ego-Motion Estimation
Integrated Navigation System
Simulation
Experimental
Covariance Estimation
Positioning
Conclusions

