Abstract

In this work, we present a tightly coupled fusion scheme of a monocular camera, a 6-DoF IMU, and a single unknown Ultra-wideband (UWB) anchor to achieve accurate and drift-reduced localization. Specifically, this letter focuses on incorporating the UWB sensor into an existing state-of-the-art visual-inertial system. Previous works toward this goal use only the single nearest UWB range measurement to update robot positions in the sliding window ("position-focused") and have demonstrated encouraging results. However, these approaches ignore 1) the time-offset between the UWB and camera sensors, and 2) all other range measurements received between two consecutive keyframes. Our approach shifts the perspective to the UWB measurements ("range-focused") by leveraging the propagated states readily available from the visual-inertial odometry pipeline. This allows the UWB data to be used more effectively: the time-offset of each range measurement is accounted for, and all available measurements can be utilized. Experimental results show that the proposed method consistently outperforms previous methods in both estimating the anchor position and reducing the drift in long-term trajectories.
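The range-focused idea described above can be sketched as follows. This is an illustrative approximation, not the paper's implementation: the function names are hypothetical, and linear interpolation of the trajectory stands in for the IMU-propagated state that a real VIO pipeline would provide between keyframes. The key point it demonstrates is evaluating the predicted robot position at each range measurement's offset-corrected timestamp, rather than snapping the measurement to the nearest keyframe.

```python
import numpy as np

def propagated_position(t, times, positions):
    """Interpolate the robot position at time t from a time-stamped
    trajectory (a stand-in for the IMU-propagated state available
    between keyframes in a VIO pipeline)."""
    return np.array([np.interp(t, times, positions[:, k]) for k in range(3)])

def range_residual(t_uwb, d_meas, time_offset, p_anchor, times, positions):
    """Residual of one UWB range measurement: predicted distance to the
    anchor at the offset-corrected measurement time, minus the measured
    range. In a factor-graph back end, one such residual would be added
    per range measurement, not just per keyframe."""
    p = propagated_position(t_uwb + time_offset, times, positions)
    return np.linalg.norm(p - p_anchor) - d_meas

# Toy example: robot moves along the x-axis; anchor at the origin.
times = np.array([0.0, 1.0])
positions = np.array([[0.0, 0.0, 0.0],
                      [2.0, 0.0, 0.0]])
p_anchor = np.zeros(3)

# With a correct time-offset of 0, a range of 1.0 taken at t=0.5
# (robot at x=1) is perfectly consistent, so the residual is ~0.
r = range_residual(0.5, 1.0, 0.0, p_anchor, times, positions)
```

An unmodeled time-offset shows up directly in this residual: shifting the same measurement by 0.25 s moves the predicted position to x = 1.5 and leaves a residual of 0.5 m, which is the error a position-focused scheme would silently absorb.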
