Abstract
This paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a GPS receiver, and a vision system, to improve its navigation performance (in real time or in a post-processing phase) by exploiting formation-flying deputies equipped with GPS receivers. The key concept is to integrate differential GPS and visual tracking information within a sensor fusion algorithm based on the Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on the filtering algorithm. Then, the flight testing strategy and experimental results are presented. In particular, the cooperative navigation output is compared with the estimates provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, which mainly derives from the possibility to exploit accurate navigation information that is independent of magnetic and inertial sensors.
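To illustrate the kind of filtering the abstract refers to, the following is a minimal Extended Kalman Filter sketch that fuses a DGPS-like position fix and a vision-like bearing measurement into a constant-velocity state. The 2-D state, the measurement models, and all noise values are assumptions made purely for illustration; this is not the filter formulation used in the paper.

```python
"""Illustrative EKF sketch (assumed models, not the paper's filter)."""
import numpy as np

dt = 0.1                                        # assumed filter step [s]
F = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])   # constant-velocity model
Q = 0.01 * np.eye(4)                            # assumed process noise

def predict(x, P):
    """Propagate state and covariance one step with the CV model."""
    return F @ x, F @ P @ F.T + Q

def update_position(x, P, z, R):
    """Linear update with a DGPS-like position measurement z = [px, py]."""
    H = np.hstack([np.eye(2), np.zeros((2, 2))])
    y = z - H @ x                               # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

def update_bearing(x, P, z, deputy_pos, R):
    """Nonlinear update with a vision-like bearing to a deputy at deputy_pos."""
    dx, dy = deputy_pos[0] - x[0], deputy_pos[1] - x[1]
    h = np.arctan2(dy, dx)                      # predicted line-of-sight angle
    r2 = dx**2 + dy**2
    H = np.array([[dy / r2, -dx / r2, 0.0, 0.0]])          # Jacobian of h
    y = np.array([np.mod(z - h + np.pi, 2 * np.pi) - np.pi])  # wrapped innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(4) - K @ H) @ P

# Example: one predict/update cycle with synthetic measurements.
x = np.array([0.0, 0.0, 1.0, 0.5])
P = np.eye(4)
x, P = predict(x, P)
x, P = update_position(x, P, z=np.array([0.12, 0.04]), R=0.5 * np.eye(2))
x, P = update_bearing(x, P, z=0.8, deputy_pos=np.array([10.0, 8.0]),
                      R=np.array([[0.01]]))
print(x)
```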