Abstract

In indoor environments, overlapping objects, varied surface materials, and uneven lighting prevent a mobile robot equipped with only a single sensor from achieving accurate localization and complete mapping. To address this problem, this paper studies multi-sensor fusion localization and mapping for indoor mobile robots, fusing data from a 2D lidar, a depth camera, an IMU, and wheel encoders. Specifically, on the one hand, an extended Kalman filter fuses the wheel odometry computed from the encoder data with the inertial measurement unit data, which reduces drift error and improves the accuracy of the robot's self-localization. On the other hand, a region proximity algorithm integrates richer visual information into the 2D laser data, compensating for the limited spatial perception of single-line laser mapping and improving the spatial completeness of the resulting map. Simulation experiments verify that the proposed method effectively improves the localization accuracy and mapping completeness of indoor robots.
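To make the odometry/IMU fusion step concrete, the following is a minimal sketch of an extended Kalman filter over a planar pose state [x, y, θ], where wheel-encoder velocities drive the prediction and an IMU yaw reading drives the correction. It is not the paper's implementation: the class name, state layout, and all noise values are illustrative assumptions.

```python
import numpy as np

class PlanarEKF:
    """Illustrative planar EKF: predict from wheel odometry, correct with IMU yaw.

    State vector: [x, y, theta]. All covariance values below are placeholder
    assumptions, not the paper's tuned parameters.
    """

    def __init__(self):
        self.x = np.zeros(3)                   # pose estimate [x, y, theta]
        self.P = np.eye(3) * 0.01              # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])   # process noise (odometry drift)
        self.R = np.array([[0.005]])           # measurement noise (IMU yaw)

    def predict(self, v, w, dt):
        """Propagate the pose with linear/angular velocity from the encoders."""
        th = self.x[2]
        self.x += np.array([v * np.cos(th) * dt,
                            v * np.sin(th) * dt,
                            w * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                      [0.0, 1.0,  v * np.cos(th) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_yaw(self, yaw_imu):
        """Correct the heading with an IMU yaw measurement (angle-wrapped)."""
        H = np.array([[0.0, 0.0, 1.0]])
        # wrap the innovation into [-pi, pi] to avoid heading jumps
        innov = np.array([np.arctan2(np.sin(yaw_imu - self.x[2]),
                                     np.cos(yaw_imu - self.x[2]))])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += (K @ innov).ravel()
        self.P = (np.eye(3) - K @ H) @ self.P
```

In this scheme, the encoder-derived prediction accumulates error over time, while the absolute IMU yaw correction bounds the heading drift, which is the dominant source of wheel-odometry error in the positions as well.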
