Abstract

Accurate localization of unmanned aerial vehicles (UAVs) is critical for navigation in GPS-denied regions and remains a challenging research topic. This article presents a multi-sensor hybrid coupled cooperative localization network (HCCNet) that fuses measurements from a camera, an ultra-wideband (UWB) radio, and an inertial measurement unit (IMU) to address this challenge. The camera and IMU estimate the pose of the UAV from perception of the surrounding environment and the inertial measurements. An onboard UWB node and a UWB wireless sensor network (WSN) deployed in the indoor environment jointly determine the global position of the UAV, and the proposed dynamic random sample consensus (D-RANSAC) algorithm improves the accuracy of this UWB localization. To fully exploit the UWB localization results, the HCCNet system combines the local pose estimates of a visual-inertial odometry (VIO) system with global constraints derived from the UWB positions. Experimental results show that the proposed D-RANSAC algorithm achieves better accuracy than other UWB-based algorithms, and the effectiveness of the proposed HCCNet method is verified with a real-world mobile robot and simulation experiments in indoor environments.
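The abstract does not spell out the D-RANSAC formulation, so the sketch below only illustrates the general idea it builds on: RANSAC-style rejection of outlier UWB ranges (e.g., non-line-of-sight spikes) before multilateration against known anchor positions. All function names, the sample size, and the inlier threshold are hypothetical choices for illustration, not the paper's implementation.

```python
import numpy as np

def solve_position(anchors, ranges):
    """Least-squares multilateration from anchor positions and range measurements.

    Linearizes the range equations by differencing against the first anchor.
    anchors: (N, 3) array of anchor coordinates; ranges: (N,) measured distances.
    """
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

def ransac_uwb_position(anchors, ranges, n_iters=200, inlier_thresh=0.3, seed=None):
    """RANSAC-style outlier rejection for UWB ranging (illustrative sketch only).

    Repeatedly fits a position from a minimal anchor subset, keeps the largest
    set of ranges consistent with that fit, and refits on those inliers.
    """
    rng = np.random.default_rng(seed)
    n = len(ranges)
    best_inliers = np.arange(n)   # fall back to using all ranges
    best_count = 0
    for _ in range(n_iters):
        sample = rng.choice(n, size=4, replace=False)   # minimal set for 3-D
        pos = solve_position(anchors[sample], ranges[sample])
        residuals = np.abs(np.linalg.norm(anchors - pos, axis=1) - ranges)
        inliers = np.flatnonzero(residuals < inlier_thresh)
        if len(inliers) > best_count and len(inliers) >= 4:
            best_count, best_inliers = len(inliers), inliers
    # Refit using only the ranges consistent with the best hypothesis.
    return solve_position(anchors[best_inliers], ranges[best_inliers])
```

In a fusion pipeline of the kind the abstract describes, a position estimate like this would serve as a global constraint added alongside the local VIO pose estimates.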
