Abstract

This paper presents a cooperative unmanned aerial vehicle navigation algorithm that allows a chief vehicle (equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system) to improve its navigation performance, in real time or in a postprocessing phase, by exploiting line-of-sight measurements of formation-flying deputies equipped with GPS receivers. The key concept is to integrate differential GPS and visual tracking information within a sensor fusion algorithm based on the extended Kalman filter. The developed concept and processing architecture are described, with a focus on the filtering algorithm. The flight-testing strategy and experimental results are then presented. In particular, the cooperative navigation output is compared with the estimates provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, which derives mainly from its ability to exploit accurate information that is independent of magnetic and inertial sensors.
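The abstract does not give the filter equations, so the following is only a minimal sketch of the kind of extended Kalman filter update it describes, under stated assumptions: the chief's position and velocity are propagated with a simple constant-velocity model (a stand-in for the inertial mechanization), and a unit line-of-sight vector toward one deputy, whose position is assumed known from differential GPS, serves as the measurement. The state definition, function names, and noise values are all hypothetical, not the paper's actual formulation.

import numpy as np

def predict(x, P, Q, dt):
    # Constant-velocity propagation of the chief state [position (3), velocity (3)].
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)
    return F @ x, F @ P @ F.T + Q

def los_update(x, P, deputy_pos, z, R):
    # EKF update with a unit line-of-sight vector z measured toward one deputy.
    d = deputy_pos - x[:3]
    r = np.linalg.norm(d)
    u = d / r                                        # predicted LOS direction h(x)
    H = np.zeros((3, 6))
    H[:, :3] = -(np.eye(3) - np.outer(u, u)) / r     # Jacobian of h w.r.t. position
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - u)
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Example: one prediction step, then one LOS update with a synthetic measurement.
x, P = np.zeros(6), np.eye(6)
x, P = predict(x, P, Q=1e-3 * np.eye(6), dt=0.1)
deputy = np.array([10.0, 5.0, -2.0])                 # deputy position from DGPS (assumed)
z = deputy / np.linalg.norm(deputy)                  # synthetic unit LOS measurement
x, P = los_update(x, P, deputy, z, R=1e-4 * np.eye(3))

In the paper's architecture this kind of update would be one of several (the real filter also fuses inertial and magnetic measurements); the sketch only illustrates why line-of-sight observations to GPS-equipped deputies carry position information that is independent of the chief's magnetic and inertial sensors.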
