Abstract

The limited field of view (FoV) of a single LiDAR poses challenges for robots to achieve comprehensive environmental perception. Incorporating multiple LiDAR sensors can effectively broaden a robot's FoV and provide abundant measurements to facilitate simultaneous localization and mapping (SLAM). In this paper, we propose PMLIO, a panoramic tightly-coupled multi-LiDAR-inertial odometry and mapping framework that fully leverages the complementary properties of solid-state and spinning LiDARs. The key to the proposed framework lies in the effective spatial-temporal fusion of the multi-LiDAR measurements. In addition, we employ an iterated extended Kalman filter to achieve tightly-coupled odometry and mapping with IMU data. PMLIO shows competitive performance on data from multiple scenarios compared with state-of-the-art single-LiDAR-inertial SLAM algorithms, achieving noteworthy improvements of 27.1% and 12.9% in the maximum and median absolute pose error (APE), respectively.
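
To make the spatial side of multi-LiDAR fusion concrete, the following is a minimal sketch, not taken from the paper, of how scans from a solid-state and a spinning LiDAR might be expressed in a shared body (IMU) frame via pre-calibrated extrinsics before being passed to the odometry back end. All names, extrinsic values, and the Eigen-based structure are illustrative assumptions.

```cpp
// Minimal sketch (assumptions, not the paper's implementation): express points
// from a solid-state and a spinning LiDAR in a common body (IMU) frame using
// pre-calibrated extrinsics, then concatenate them for downstream odometry.
#include <Eigen/Dense>
#include <cmath>
#include <iostream>
#include <vector>

struct Extrinsic {
    Eigen::Matrix3d R;  // rotation from LiDAR frame to body frame
    Eigen::Vector3d t;  // translation from LiDAR frame to body frame
};

// Transform every point of one LiDAR scan into the shared body frame.
std::vector<Eigen::Vector3d> toBodyFrame(const std::vector<Eigen::Vector3d>& scan,
                                         const Extrinsic& ext) {
    std::vector<Eigen::Vector3d> out;
    out.reserve(scan.size());
    for (const auto& p : scan) out.push_back(ext.R * p + ext.t);
    return out;
}

int main() {
    // Hypothetical extrinsics: identity mount for the solid-state LiDAR,
    // a yawed and offset mount for the spinning LiDAR.
    Extrinsic solid{Eigen::Matrix3d::Identity(), Eigen::Vector3d::Zero()};
    Extrinsic spin{Eigen::AngleAxisd(M_PI / 2.0, Eigen::Vector3d::UnitZ()).toRotationMatrix(),
                   Eigen::Vector3d(0.1, 0.0, 0.2)};

    // Toy scans with one point each, standing in for full point clouds.
    std::vector<Eigen::Vector3d> solidScan{{1.0, 0.0, 0.0}};
    std::vector<Eigen::Vector3d> spinScan{{0.0, 1.0, 0.0}};

    // Spatial fusion: concatenate both scans once expressed in the body frame.
    auto fused = toBodyFrame(solidScan, solid);
    auto spinBody = toBodyFrame(spinScan, spin);
    fused.insert(fused.end(), spinBody.begin(), spinBody.end());

    for (const auto& p : fused) std::cout << p.transpose() << "\n";
    return 0;
}
```

In a full pipeline the temporal side of the fusion (aligning scans acquired at different times and rates) and the IMU-driven state estimation would follow this step; the sketch only illustrates the extrinsic transformation into a common frame.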
