Abstract

Relative localization is a prerequisite for aerial swarms, and vision is one of its most important sensing modalities. However, visual information is often lost due to the camera's limited field of view and occlusion, which degrades vision-based relative localization. This paper proposes a novel method to compensate for such visual loss. The method uses an intermittent Kalman filter to fuse visual information with inertial measurement unit (IMU) data, and an IMU observer is designed to reduce the estimation error. Furthermore, to suppress abrupt oscillations in the estimate, a radial basis function (RBF) neural network fuses Visual-Inertial Odometry (VIO), Ultra-Wideband (UWB) measurements, and the output of the Observer-based Intermittent Kalman Filter (OIKF). Real-world experiments show that the proposed method reduces the mean square error by more than 30% compared with an optimization-based method.
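
The core idea of an intermittent Kalman filter can be sketched as follows. This is a minimal illustrative example, not the paper's exact formulation: the model (a 1-D constant-velocity system), the noise values, and the function name are all assumptions. The predict step always runs (e.g. driven by IMU propagation), while the measurement update runs only when a visual observation is actually available.

```python
import numpy as np

def intermittent_kf(zs, dt=0.1, q=1e-3, r=1e-2):
    """Run an intermittent Kalman filter over a measurement sequence.

    zs: list of position measurements; None marks a step where vision is lost.
    Returns the position estimate at every step.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
    H = np.array([[1.0, 0.0]])              # vision observes position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.zeros((2, 1))                    # state: [position, velocity]
    P = np.eye(2)                           # state covariance
    out = []
    for z in zs:
        # Predict step: always executed, regardless of visual availability.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: only when a visual measurement arrived this step.
        if z is not None:
            y = np.array([[z]]) - H @ x             # innovation
            S = H @ P @ H.T + R                     # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out

# Usage: vision drops out for two steps in the middle; the filter coasts
# on the motion model and corrects again once measurements return.
est = intermittent_kf([0.0, 0.1, None, None, 0.4, 0.5])
```

During the dropout the covariance `P` keeps growing through the predict step, so when vision returns the filter weights the new measurement heavily; the paper's IMU observer and RBF-based fusion then further reduce the error and smooth the resulting estimate.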
