Abstract

This paper presents a vision/LiDAR integrated navigation system that provides accurate relative navigation on general ground surfaces in GNSS-denied environments. The ground surface overflown during flight is approximated by a piecewise continuous model composed of flat and sloped profiles. The implemented system consists of a strapdown IMU and an aiding sensor block, a vision sensor and a LiDAR mounted on a stabilized gimbal platform. Two-dimensional optical flow vectors from the vision sensor and LiDAR range measurements to the ground are used to overcome the performance limit of a tactical-grade inertial navigation solution operating without GNSS signals. The filter is built on an INS error model, with a measurement vector containing two-dimensional velocity errors and one differenced altitude in the navigation frame. In computing the altitude difference, the ground slope angle is estimated in a novel way from two bisectional LiDAR signals, under a practical assumption representing a general ground profile. Finally, the overall integrated system is implemented within an extended Kalman filter framework, and its performance is demonstrated through a simulation study with an aircraft flight trajectory scenario.
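To make the abstract's two core computations concrete, the sketch below illustrates (a) recovering the ground slope angle from two bisectional LiDAR ranges and (b) forming the 3x1 measurement vector of two velocity errors plus one differenced altitude. This is a minimal illustrative sketch, not the paper's implementation: the function names, the assumption that both beams lie in the vertical plane of flight with level-referenced offsets from nadir (thanks to the stabilized gimbal), and the axis conventions are all assumptions introduced here.

```python
import numpy as np

def estimate_ground_slope(r1, r2, theta1, theta2):
    """Slope angle of the ground segment seen by two bisectional LiDAR beams.

    Assumes (illustrative, not from the paper) that both beams lie in the
    vertical plane of flight, offset from nadir by theta1 and theta2 [rad],
    with the gimbal keeping the beam directions level-referenced.
    Axes: x forward (horizontal), z down.
    """
    x1, z1 = r1 * np.sin(theta1), r1 * np.cos(theta1)  # first ground hit
    x2, z2 = r2 * np.sin(theta2), r2 * np.cos(theta2)  # second ground hit
    # Piecewise flat/slope ground model: the segment between the two hits
    # approximates the local ground profile; on flat ground this returns 0.
    return np.arctan2(z1 - z2, x2 - x1)

def measurement_vector(v_flow, v_ins, h_lidar, h_ins):
    """3x1 EKF measurement: 2-D velocity errors plus one differenced altitude.

    v_flow : horizontal velocity derived from optical flow (2-vector) [m/s]
    v_ins  : INS-indicated horizontal velocity (2-vector) [m/s]
    h_lidar: slope-compensated altitude above ground from LiDAR [m]
    h_ins  : INS-indicated altitude referenced to the same ground level [m]
    """
    return np.array([v_ins[0] - v_flow[0],
                     v_ins[1] - v_flow[1],
                     h_ins - h_lidar])
```

For example, two beams 0.1 rad either side of nadir over ground sloping up at 5 degrees yield ranges whose hit points lie on that slope, and `estimate_ground_slope` returns approximately 0.087 rad; the slope estimate then corrects the LiDAR altitude before differencing against the INS altitude.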

Highlights

  • One of the most essential technologies for operating unmanned vehicles is the computation of navigation information, including the position, velocity, and attitude of the vehicle.

  • The integrated GPS/INS system, which uses GPS (Global Positioning System) satellite signals to correct the solution of an Inertial Navigation System (INS), can provide a drift-free and accurate solution for a highly dynamic vehicle, and has been widely used on ground and aerial vehicles for precise navigation.

  • We present a Vision/LiDAR-aided integrated navigation system for environments where 1) GNSS signals cannot be used, 2) the ground surface altitude varies, and 3) no map information is available.


Summary

Introduction

One of the most essential technologies for operating unmanned vehicles is the computation of navigation information, including the position, velocity, and attitude of the vehicle. Research on detecting moving cars and pedestrians, and on indoor path planning for UAVs, has combined computer vision processing results with LiDAR measurements [16,17,18]. These efforts placed most of their emphasis on detecting and localizing possible obstacles via heterogeneous sensor fusion, and fell short of generating comprehensive state information for the host vehicle. We present a Vision/LiDAR-aided integrated navigation system for environments where 1) GNSS signals cannot be used, 2) the ground surface altitude varies, and 3) no map information is available.

System Illustration
Design
Integrated Navigation Filter
Results
