Abstract
Precise positioning in an indoor environment is a challenging task because it is difficult to receive a strong and reliable global positioning system (GPS) signal. Among existing wireless indoor positioning methods, ultra-wideband (UWB) has become popular because of its low energy consumption and high interference immunity. Nevertheless, factors such as indoor non-line-of-sight (NLOS) obstructions can still lead to large errors or fluctuations in the measurement data. In this paper, we propose a fusion method based on UWB, an inertial measurement unit (IMU), and visual simultaneous localization and mapping (V-SLAM) to achieve high accuracy and robustness in tracking a mobile robot in a complex indoor environment. Specifically, we first focus on distinguishing between line-of-sight (LOS) and NLOS UWB signals and correcting the latter. The distance estimated from UWB is processed by an adaptive Kalman filter together with IMU signals for pose estimation, where a new noise covariance matrix based on the received signal strength indicator (RSSI) and estimation of precision (EOP) is proposed to reduce the effect of NLOS conditions. After that, the corrected UWB estimate is tightly integrated with the IMU and visual SLAM through factor graph optimization (FGO) to further refine the pose estimation. The experimental results show that, compared with single or dual positioning systems, the proposed fusion method provides significant improvements in positioning accuracy in a complex indoor environment.
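To illustrate the adaptive-noise idea described above, the following is a minimal sketch of a scalar Kalman update for a UWB range measurement in which the measurement noise covariance is inflated when a weak RSSI suggests an NLOS condition. The function name, the LOS reference level `rssi_los`, and the quadratic penalty are illustrative assumptions, not the paper's actual covariance model (which also incorporates EOP).

```python
import numpy as np

def adaptive_kf_update(x, P, z, rssi, rssi_los=-75.0, r_base=0.01):
    """One Kalman update of a scalar UWB range estimate.

    Hypothetical adaptation rule: the further the measured RSSI falls
    below a LOS reference level, the more the measurement variance R
    is inflated, so the filter trusts the prediction over the
    (likely NLOS-biased) range measurement.
    """
    nlos_penalty = max(0.0, rssi_los - rssi)   # dB below the LOS reference
    R = r_base * (1.0 + nlos_penalty) ** 2     # inflated measurement variance
    K = P / (P + R)                            # Kalman gain
    x_new = x + K * (z - x)                    # state update
    P_new = (1.0 - K) * P                      # covariance update
    return x_new, P_new

# LOS-like measurement (strong RSSI): the update follows the range closely.
x_los, _ = adaptive_kf_update(x=5.0, P=0.05, z=5.2, rssi=-70.0)
# NLOS-like measurement (weak RSSI): same innovation, much smaller correction.
x_nlos, _ = adaptive_kf_update(x=5.0, P=0.05, z=5.2, rssi=-95.0)
```

Under this rule, the same 0.2 m innovation moves the LOS estimate most of the way toward the measurement, while the NLOS estimate barely changes, which is the qualitative behavior the adaptive covariance is designed to produce.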