Abstract

The Global Navigation Satellite System (GNSS) and Strapdown Inertial Navigation System (SINS) integrated navigation system is now widely used on Unmanned Aerial Vehicles (UAVs). However, the system cannot operate properly when GNSS is unavailable. In this paper we propose a systematic framework for a visual-aided inertial autonomous navigation system to solve this problem. The proposed system fuses information from three navigation subsystems: the inertial navigation system, the satellite navigation system, and the visual navigation system. The system obtains video from a monocular camera mounted on a UAV. Consecutive image frames are then matched through detection and matching of Oriented FAST and Rotated BRIEF (ORB) features. An algorithm of treble mismatching-point-pair elimination is developed to remove outliers, and an optimization method based on evaluating measurements against motion models is developed to reduce errors. Both a classical Kalman filter and a multi-rate Kalman filter are applied to realize positioning in the visual-aided inertial navigation system. Flight experiments were conducted on a self-built multi-rotor UAV. The results show that the proposed system operates normally and provides accurate navigation information when GNSS is briefly unavailable.
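The abstract mentions that a classical Kalman filter is used to fuse the visual and inertial measurements. As a rough illustration of that fusion principle only (the paper's actual state vector, noise parameters, and measurement model are not given in the abstract), the following is a minimal sketch of one predict/update cycle of a classical Kalman filter on a hypothetical constant-velocity model, where a position-like measurement corrects an inertially propagated state. All matrices and numbers here are assumptions for demonstration, not values from the paper.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a classical (linear) Kalman filter."""
    # Predict: propagate state and covariance through the motion model F
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct the prediction with the measurement z
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical 1-D example: state = [position, velocity], time step dt
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity motion model
Q = 0.01 * np.eye(2)                     # process noise (assumed)
H = np.array([[1.0, 0.0]])               # only position is observed
R = np.array([[0.5]])                    # measurement noise (assumed)

x = np.array([0.0, 1.0])                 # initial state: pos 0, vel 1 m/s
P = np.eye(2)

# Feed synthetic position measurements from a target moving at 1 m/s
for k in range(1, 20):
    z = np.array([k * dt])
    x, P = kalman_step(x, P, z, F, Q, H, R)

print(x)  # estimate stays close to the true [position, velocity]
```

In the actual system the state would be higher-dimensional (attitude, velocity, position, sensor biases) and, per the abstract, a multi-rate variant handles the differing update rates of the camera and the IMU; this sketch only shows the basic predict/correct structure being reused.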
