Abstract
In urban environments, simultaneous localization and mapping (SLAM) is essential for autonomous driving. Most light detection and ranging (LiDAR) SLAM methodologies have been developed for relatively static environments, even though real-world environments contain many dynamic objects such as vehicles, bicycles, and pedestrians. This paper proposes an efficient and robust LiDAR SLAM framework that leverages an estimated background model to achieve robust motion estimation in dynamic urban environments. Based on probabilistic object estimation, the dynamic-removal module estimates a nonparametric background model to recognize dynamic objects. This module estimates the probability of differences in range values across accumulated LiDAR frames. Dynamic objects are then removed by adapting to the sensor velocity obtained from the estimated ego-motion. In the local mapping module, our method optimizes the LiDAR motion while accounting for the dynamic characteristics of LiDAR point clouds. Finally, the proposed method produces a global map of static point clouds and accurate LiDAR motion through global pose optimization. We tested the proposed method on the well-known public KITTI dataset and on a custom dataset with complex environments containing various moving objects. Comparisons with state-of-the-art (SOTA) methods demonstrate that our approach is more robust and efficient. For example, the proposed method achieved average translation and rotation errors of 0.63% and $0.18^{\circ}/100\,\mathrm{m}$ on the KITTI dataset with a $0.96\,\mathrm{ms}$ processing time, confirming real-time capability.
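To make the dynamic-removal idea concrete, the following minimal Python sketch shows one way a nonparametric background model over per-beam range values could flag dynamic points, in the spirit of the abstract's description. The kernel density estimator, the `BAND_WIDTH` and `DYNAMIC_THRESHOLD` parameters, and the function names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (assumed, not the paper's method): classify a new range
# reading for one LiDAR beam direction by testing how likely it is under
# a nonparametric (kernel density) background model built from range
# values of the same beam accumulated over previous frames.
import numpy as np

BAND_WIDTH = 0.2          # Gaussian kernel bandwidth in metres (assumed value)
DYNAMIC_THRESHOLD = 0.05  # minimum background likelihood (assumed value)

def background_likelihood(new_range: float, past_ranges: np.ndarray) -> float:
    """Kernel density estimate of the background model at new_range,
    built from the accumulated range samples of the same beam."""
    diffs = (new_range - past_ranges) / BAND_WIDTH
    kernels = np.exp(-0.5 * diffs**2) / (BAND_WIDTH * np.sqrt(2.0 * np.pi))
    return kernels.mean()

def is_dynamic(new_range: float, past_ranges: np.ndarray) -> bool:
    """Flag a point as dynamic when its range disagrees with the
    accumulated background model (low likelihood)."""
    return background_likelihood(new_range, past_ranges) < DYNAMIC_THRESHOLD

# Example: the background for this beam sits near 20 m; a reading at 8 m
# (e.g. a passing vehicle) is flagged as dynamic, while a reading
# consistent with the background is kept as static.
past = np.random.normal(20.0, 0.05, size=50)
print(is_dynamic(8.0, past))    # True  -> removed as a dynamic point
print(is_dynamic(20.02, past))  # False -> kept in the static map
```

In a full pipeline, such a per-beam test would run after motion compensation (using the estimated ego-motion, as the abstract notes), so that range differences reflect object motion rather than sensor motion.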