Abstract

Drones are becoming increasingly significant for a vast range of applications, such as firefighting and rescue. While flying in challenging environments, reliable Global Navigation Satellite System (GNSS) measurements cannot be guaranteed all the time, and the Inertial Navigation System (INS) navigation solution deteriorates dramatically. Although different aiding sensors, such as cameras, have been proposed to reduce the effect of these drift errors, the positioning accuracy of such techniques is still affected by several challenges, such as the lack of observed features, inconsistent matches, illumination, and environmental conditions. This paper presents an integrated navigation system for Unmanned Aerial Vehicles (UAVs) in GNSS denied environments based on a Radar Odometry (RO) and an enhanced Visual Odometry (VO) to handle such challenges, since radar is immune to these issues. The forward velocities of the vehicle estimated by both the RO and the enhanced VO are fused with the Inertial Measurement Unit (IMU), barometer, and magnetometer measurements via an Extended Kalman Filter (EKF) to enhance the navigation accuracy during GNSS signal outages. The RO and VO are combined into one integrated system to help overcome their individual limitations: the RO measurements are degraded while flying over non-flat terrain, so the integration of the VO is important in such scenarios. The experimental results demonstrate the proposed system's ability to significantly enhance the 3D positioning accuracy during GNSS signal outages.
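As a rough illustration of the fusion described above (a minimal sketch under simplifying assumptions, not the authors' implementation), the Python snippet below propagates a six-element position/velocity state with IMU-derived acceleration and then fuses a scalar forward-speed measurement, such as the RO or the enhanced VO would provide, through an EKF update. The state layout, the known body-to-navigation rotation R_nb, and the function names predict and update_forward_velocity are assumptions made for illustration; barometer and magnetometer updates would follow the same scalar-update pattern.

```python
import numpy as np

# Minimal EKF sketch (illustrative only). State x = [p_n (3), v_n (3)] in the
# navigation frame; attitude R_nb is assumed known from the INS mechanization.

def predict(x, P, a_nav, Q, dt):
    """Propagate position and velocity with IMU-derived navigation-frame acceleration."""
    F = np.eye(6)
    F[0:3, 3:6] = dt * np.eye(3)        # p <- p + v * dt
    x = F @ x
    x[3:6] = x[3:6] + a_nav * dt        # v <- v + a_nav * dt
    P = F @ P @ F.T + Q
    return x, P

def update_forward_velocity(x, P, v_fwd_meas, R_nb, r_var):
    """Fuse a scalar body-frame forward speed (as provided by RO or enhanced VO)."""
    # Forward speed = first component of R_nb^T @ v_nav, i.e. the projection
    # of the navigation-frame velocity onto the body x-axis.
    H = np.zeros((1, 6))
    H[0, 3:6] = R_nb[:, 0]
    z_pred = (H @ x)[0]
    S = (H @ P @ H.T)[0, 0] + r_var     # innovation covariance (scalar)
    K = (P @ H.T) / S                   # Kalman gain, shape (6, 1)
    x = x + K[:, 0] * (v_fwd_meas - z_pred)
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Toy usage: level flight heading north at 5 m/s; radar reports 4.8 m/s forward.
x = np.array([0.0, 0.0, 0.0, 5.0, 0.0, 0.0])
P = np.eye(6) * 0.5
x, P = predict(x, P, a_nav=np.zeros(3), Q=np.eye(6) * 1e-3, dt=0.01)
x, P = update_forward_velocity(x, P, v_fwd_meas=4.8, R_nb=np.eye(3), r_var=0.04)
print(np.round(x, 3))  # velocity estimate is pulled toward the radar measurement
```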

Highlights

  • Over the past 10 years, Unmanned Aerial Vehicles (UAVs) have become increasingly significant for a wide range of applications, such as firefighting and rescue

  • While this system provides a navigation solution in Global Navigation Satellite System (GNSS) denied environments, it is unable to distinguish between rotation and translation, especially when the vehicle rotates with little or no translation

  • The GNSS-measured positions and the computed positions of the GNSS antenna center P^ned_GNSS, which are derived from the Inertial Navigation System (INS) positions P^ned_IMU in the navigation frame, are utilized as measurement updates in the Extended Kalman Filter (EKF) (see the sketch below)
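The snippet below loosely illustrates how such a position update could be formed (a minimal sketch, not the paper's implementation): the antenna-center position is computed from the INS position using an assumed body-frame lever arm and a known body-to-navigation rotation, and the innovation is taken against the GNSS fix. The lever-arm value, the function names, and the NumPy-based style are assumptions for illustration.

```python
import numpy as np

def antenna_position_ned(p_imu_ned, R_nb, lever_arm_b):
    """INS-derived position of the GNSS antenna center in the NED frame."""
    # Rotate the body-frame lever arm into the navigation frame and add it
    # to the IMU (INS) position.
    return p_imu_ned + R_nb @ lever_arm_b

def gnss_position_innovation(p_gnss_ned, p_imu_ned, R_nb, lever_arm_b):
    """Residual fed to the EKF: measured minus INS-derived antenna position."""
    return p_gnss_ned - antenna_position_ned(p_imu_ned, R_nb, lever_arm_b)

# Toy usage with an assumed 0.2 m forward lever arm and level attitude.
p_imu = np.array([100.00, 50.00, -30.00])    # INS position, NED, metres
p_gnss = np.array([100.25, 50.02, -30.01])   # GNSS fix of the antenna center
innovation = gnss_position_innovation(p_gnss, p_imu, np.eye(3),
                                       np.array([0.2, 0.0, 0.0]))
print(innovation)  # residual used as the EKF measurement update when GNSS is available
```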

Summary

Introduction

Over the past 10 years, Unmanned Aerial Vehicles (UAVs) have become increasingly significant for a wide range of applications. Due to the advancement in technology, onboard cameras have evolved to have a small size, light weight, and low power consumption. They provide valuable measurements in terms of color and texture, which are used to improve the navigation solution accuracy during GNSS outage periods. The performance of the proposed system is evaluated in simulated non-straight flight trajectories with different banking angles. The RO is assessed in a real flight for one minute, where a SAR radar is mounted on a Cessna aircraft. Two flights are performed in different places with multiple radar configurations to assess the proposed system's performance. While this system provides a navigation solution in GNSS denied environments, it is unable to distinguish between rotation and translation, especially when the vehicle rotates with little or no translation.

System Overview of the Proposed Algorithm
Frequency Modulated Continuous Wave Radar Odometry
Target Detection and Data Extraction
Enhanced Monocular Visual Odometry
Velocity Compensation
Data Fusion
Prediction Model
Observation Model
Hardware Setup
A K-MD2 radar module is attached to the quadcopter belly through a wooden
First Experiment Flight
Second Experiment Flight
Conclusions
