SLAM (Simultaneous Localization and Mapping) is the key technology that enables mobile robots to obtain information about their position and surrounding environment, but both lidar and vision sensors have limitations when solving the SLAM problem: a single laser sensor cannot effectively map hollowed-out obstacles, while a single vision sensor performs poorly in texture-less areas. In addition, existing loop-closure detection methods still suffer from high computational cost and detection errors. To address these problems, this paper presents an improved SLAM algorithm that fuses laser and vision data. In this algorithm, a Gaussian statistical linear interpolation method is first used to interpolate the laser point cloud, and the interpolated laser occupancy grid map and the visual occupancy grid map are then fused by a Bayesian method to obtain a denser and more accurate map. Furthermore, a histogram-based method is used to improve the loop-closure detection module and further increase localization accuracy. The improved loop-closure detection algorithm was validated in simulation on the TUM dataset. On this basis, a mobile robot experimental system was built to map a laboratory scene. The experimental results show that the proposed algorithm can accurately and effectively map hollowed-out areas, significantly improves the robot's mapping quality, and provides more accurate localization information, demonstrating good practicality.
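To illustrate the map-fusion step described above, the sketch below shows one common way to combine a laser occupancy grid and a visual occupancy grid with a Bayesian (log-odds) update. The grid sizes, probability values, and function names are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def prob_to_log_odds(p):
    # Convert occupancy probabilities to log-odds so evidence can be added.
    return np.log(p / (1.0 - p))

def log_odds_to_prob(l):
    # Convert fused log-odds back to occupancy probabilities.
    return 1.0 / (1.0 + np.exp(-l))

def fuse_grids(laser_grid, visual_grid, prior=0.5):
    """Fuse two occupancy grid maps cell by cell under a Bayesian
    independent-sensor assumption (illustrative sketch, not the paper's code)."""
    l_laser = prob_to_log_odds(np.clip(laser_grid, 1e-3, 1 - 1e-3))
    l_visual = prob_to_log_odds(np.clip(visual_grid, 1e-3, 1 - 1e-3))
    l_prior = prob_to_log_odds(prior)
    # Summing log-odds and subtracting the prior applies Bayes' rule for
    # conditionally independent measurements of the same cell.
    return log_odds_to_prob(l_laser + l_visual - l_prior)

if __name__ == "__main__":
    # Toy 2x2 grids: the laser map is uncertain about a hollowed-out obstacle
    # (cell [0, 1]) that the visual map sees as likely occupied.
    laser = np.array([[0.9, 0.5], [0.1, 0.5]])
    vision = np.array([[0.8, 0.85], [0.2, 0.5]])
    print(fuse_grids(laser, vision))
```

In this kind of fusion, a cell that only one sensor observes keeps that sensor's evidence, while cells observed by both accumulate evidence, which is why the fused map can recover hollowed-out obstacles that the laser-only grid misses.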