In recent years, the rise of unmanned systems has made Simultaneous Localization and Mapping (SLAM) a focal point of robotics research. SLAM algorithms are primarily categorized into visual SLAM and laser SLAM according to the external sensors employed. Laser SLAM has become essential in robotics and autonomous driving owing to its insensitivity to lighting conditions, precise distance measurements, and ease of generating navigation maps. Although numerous effective algorithms have been introduced over the course of SLAM's development, existing methods still face challenges such as localization error and suboptimal utilization of sensor data. To address these issues, this paper proposes a tightly coupled SLAM algorithm based on similarity detection. The algorithm integrates Inertial Measurement Unit (IMU) and LiDAR odometry modules, processes the sensor data in a tightly coupled manner, and applies curvature-based feature extraction and optimization to improve the accuracy and robustness of inter-frame matching. In addition, it adopts a local-keyframe sliding-window method and introduces a similarity detection mechanism, reducing the real-time computational load and improving efficiency. Experiments on the KITTI dataset demonstrate that the algorithm achieves reduced positioning error and improved global consistency; estimated trajectories are evaluated against the ground truth using absolute trajectory error (ATE) and root mean square error (RMSE).
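The ATE/RMSE evaluation mentioned above is conventionally computed as the root mean square of the per-pose translational differences between the estimated trajectory and the ground truth, after the two trajectories have been time-associated and rigidly aligned. The following is a minimal sketch of that metric (not the paper's own evaluation code); the function name and the assumption that alignment has already been performed are ours:

```python
import numpy as np

def ate_rmse(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
    """ATE as the RMSE of per-pose translational errors.

    Both inputs are (N, 3) arrays of positions, already time-associated
    and rigidly aligned (e.g. via a Umeyama/Horn alignment step).
    """
    diffs = estimated - ground_truth          # per-pose translation error
    sq_norms = np.sum(diffs ** 2, axis=1)     # squared Euclidean error per pose
    return float(np.sqrt(np.mean(sq_norms)))  # root mean square over the track

# Tiny synthetic check: a constant 1 m offset along x yields ATE = 1.0
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = gt + np.array([1.0, 0.0, 0.0])
print(ate_rmse(est, gt))  # prints 1.0
```

In practice the alignment step matters: without it, a globally consistent trajectory with a mere start-pose offset would be penalized as heavily as a drifting one.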