Abstract

Because monocular cameras are lightweight, monocular Simultaneous Localization and Mapping (SLAM) is an active research area and enables countless applications for micro Unmanned Aerial Vehicles (UAVs), especially in GPS-denied indoor environments. However, the motion of UAVs is often faster and more complex than that of ground robots, and estimating the trajectory from ego-motion alone leads to error accumulation. To achieve higher accuracy at low power cost, we fuse visual and inertial measurements for UAV indoor navigation. In this paper, we propose a novel loosely-coupled system that integrates monocular visual odometry (VO) with readings from an inertial navigation system (INS) for indoor UAV localization. We acquire the Inertial Measurement Unit (IMU) data and the VO results independently and map them into a shared feature space, defined by the tensor product of the individual kernels for each source. Using kernel least mean squares (KLMS), a kernel adaptive filtering method, these data are fused in this high-dimensional space. Experiments verify the method: compared with vision-only algorithms, the kernel adaptive filtering approach improves the localization accuracy of UAVs.
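The abstract gives no implementation details, so the following is only a minimal, self-contained Python sketch of what a KLMS update over a tensor product of two Gaussian kernels (one for VO features, one for IMU features) could look like. Every name here (TensorProductKLMS, gaussian_kernel, the eta and sigma parameters, the scalar target) is an illustrative assumption, not the authors' code; the paper's actual feature construction and training targets are not specified in the abstract.

import numpy as np

def gaussian_kernel(a, b, sigma):
    # Gaussian (RBF) kernel between two 1-D feature vectors.
    d = a - b
    return np.exp(-(d @ d) / (2.0 * sigma ** 2))

class TensorProductKLMS:
    # KLMS over the tensor-product kernel
    #   k((u, v), (u', v')) = k_vo(u, u') * k_imu(v, v'),
    # where u is a VO feature vector and v is an IMU feature vector.
    def __init__(self, eta=0.5, sigma_vo=1.0, sigma_imu=1.0):
        self.eta = eta              # learning-rate (step size); assumed value
        self.sigma_vo = sigma_vo    # VO kernel bandwidth; assumed value
        self.sigma_imu = sigma_imu  # IMU kernel bandwidth; assumed value
        self.centers = []           # stored (vo, imu) input pairs
        self.alphas = []            # expansion coefficients

    def _kernel(self, x, c):
        u, v = x
        cu, cv = c
        return (gaussian_kernel(u, cu, self.sigma_vo)
                * gaussian_kernel(v, cv, self.sigma_imu))

    def predict(self, x):
        # f(x) = sum_i alpha_i * k(x, x_i); returns 0 before any update.
        return sum(a * self._kernel(x, c)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, x, d):
        # One KLMS step: append x as a new center with coefficient
        # eta * (prediction error), the standard KLMS recursion.
        e = d - self.predict(x)
        self.centers.append(x)
        self.alphas.append(self.eta * e)
        return e

# Hypothetical usage: at each frame, pair a VO feature with an IMU
# feature and regress toward a reference (e.g. one pose coordinate).
# filt = TensorProductKLMS()
# err = filt.update((vo_feat, imu_feat), target)

For a vector-valued target such as a 3-D displacement, one filter per coordinate (or vector-valued coefficients) would be used. Note also that plain KLMS grows one expansion term per sample; a practical online system would add a budget or sparsification rule, which this sketch omits.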
