Abstract

This paper investigates augmenting an Unmanned Aerial Vehicle (UAV) navigation system with a passive video camera in order to cope with long-term GPS outages. The paper proposes a vision-based navigation architecture which combines inertial sensors, visual odometry, and registration of the on-board video to a geo-referenced aerial image. The resulting vision-aided navigation system provides high-rate and drift-free state estimation for UAV autonomous navigation without GPS. Because absolute position is computed by image-to-map registration, drift-free position performance depends on the structural characteristics of the terrain. An experimental evaluation of the approach based on offline flight data is provided. In addition, the proposed architecture has been implemented on-board an experimental UAV helicopter platform and tested during vision-based autonomous flights.

Highlights

  • One of the main concerns preventing the use of Unmanned Aerial Vehicle (UAV) systems in populated areas is safety

  • This paper proposes a navigation system which can cope with GPS outages

  • The reference image of the area used for this experiment is an orthorectified aerial image of 1 meter/pixel resolution with a submeter position accuracy


Summary

Introduction

One of the main concerns preventing the use of UAV systems in populated areas is safety. Autonomous unmanned aerial vehicles usually rely on a GPS position signal which, combined with inertial measurement unit (IMU) data, provides high-rate and drift-free state estimation suitable for control purposes. Advanced cruise missiles implement a complex navigation system based on GPS, TERNAV, and Digital Scene Matching Area Correlation (DSMAC). The contribution of this work is to explore the possibility of using a single video camera to measure both relative displacement (odometry) and absolute position (image registration). We believe this is a practical and innovative concept. The approach presented is implemented on a Yamaha Rmax unmanned helicopter and tested on-board during autonomous flight-test experiments.

Sensor Fusion Architecture
Visual Odometry
Image Registration
Sensor Fusion Algorithms
UAV Platform
Experimental Results
Conclusions and Future Work
