Abstract

Multi-robot simultaneous localization and mapping (MR-SLAM) is essential for efficient exploration of large-scale environments. Despite remarkable advances in cooperation schemes, approaches that handle the multiple uncertainties inherent to MR-SLAM in large-scale environments remain scarce. This paper proposes a multi-uncertainty captured multi-robot LiDAR odometry and mapping (MUC-LOAM) framework that quantifies and exploits the uncertainties of feature points and robot mutual poses in large-scale environments. A hybrid weighting strategy for pose updates is integrated into MUC-LOAM to handle feature uncertainty arising from changing distances and from dynamic objects. A Bayesian Neural Network (BNN) is designed to capture mutual pose uncertainty, and covariance propagation through the quaternion-to-Euler-angle conversion is leveraged to filter out unreliable mutual poses. A further covariance propagation through the coordinate transformations in nonlinear optimization improves the accuracy of map merging. The feasibility and enhanced robustness of the proposed framework for large-scale exploration are validated on public datasets and in real-world experiments.
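The quaternion-to-Euler covariance propagation referred to in the abstract is, in general, a first-order (Jacobian-based) technique. The sketch below is an illustration of that general idea, not the paper's implementation: it propagates an assumed 4x4 quaternion covariance (e.g., as a BNN might output) to a 3x3 Euler-angle covariance via J Σ Jᵀ, with the Jacobian estimated by central finite differences. The function names, the example values, and the SciPy dependency are all illustrative assumptions.

```python
# Minimal sketch (not from the paper): first-order covariance propagation
# through the quaternion -> Euler angles conversion, cov_euler = J @ cov_q @ J.T,
# with the Jacobian J estimated by central finite differences.
import numpy as np
from scipy.spatial.transform import Rotation


def quat_to_euler(q):
    """Convert a quaternion [x, y, z, w] (normalized here) to XYZ Euler angles (rad)."""
    return Rotation.from_quat(q / np.linalg.norm(q)).as_euler("xyz")


def propagate_quat_cov_to_euler(q, cov_q, eps=1e-6):
    """Propagate a 4x4 quaternion covariance to a 3x3 Euler-angle covariance."""
    J = np.zeros((3, 4))
    for i in range(4):
        dq = np.zeros(4)
        dq[i] = eps
        # Central difference approximates the i-th column of the 3x4 Jacobian.
        J[:, i] = (quat_to_euler(q + dq) - quat_to_euler(q - dq)) / (2 * eps)
    return J @ cov_q @ J.T


# Example: a mutual-pose rotation estimate with small quaternion uncertainty.
q = np.array([0.0, 0.0, 0.3826834, 0.9238795])  # 45 deg rotation about z
cov_q = 1e-4 * np.eye(4)                        # assumed quaternion covariance
cov_euler = propagate_quat_cov_to_euler(q, cov_q)
print(np.sqrt(np.diag(cov_euler)))  # per-axis angular std dev (rad)
```

In such a scheme, a mutual pose whose propagated Euler-angle variances exceed a threshold could be rejected as unreliable, which matches the filtering role the abstract describes; the specific threshold and gating logic are not specified here.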
