Abstract

Two-dimensional (2D) LiDAR and RGB-D cameras are two widely used sensors in robot navigation tasks. Even after calibration, both still suffer from noise such as hollows and speckle burrs, often caused by complex external environments. This paper presents a data fusion method for 2D LiDAR scans and RGB-D depth images. The proposed method exploits the fact that the data from 2D LiDAR and RGB-D sensors differ greatly in format yet are closely related through their depth information. With time alignment and correlation analysis, we find that 2D LiDAR scan lines can be registered to RGB-D depth images by height, and conversely, the depth-image rows at the corresponding height can be registered to the 2D LiDAR range curves by width, automatically and even without calibration between the two sensors. In experiments, we evaluate the proposed method on the Robot@Home dataset, a widely recognized open indoor robot navigation database. The results show that the proposed method denoises the original data of both the 2D LiDAR and the RGB-D depth images simultaneously. The method is also validated in realistic navigation environments and could be extended to more precise 2D map construction for robot navigation.
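The abstract does not give implementation details, but a minimal sketch of the correlation-based height registration it describes might look as follows. The helper name, the use of normalized cross-correlation, and the assumption that the LiDAR scan and the depth image already cover the same horizontal field of view are illustrative assumptions, not the authors' code.

```python
import numpy as np

def register_scan_to_depth_rows(lidar_ranges, depth_image):
    """Find the depth-image row that best matches a 2D LiDAR scan.

    lidar_ranges : 1-D array of ranges (m); assumed to span the camera's
                   horizontal field of view (an illustrative assumption)
    depth_image  : 2-D array (H x W) of depth values (m), NaN where invalid
    Returns the best-matching row index and its correlation score.
    """
    h, w = depth_image.shape

    # Resample the scan so it has one sample per image column.
    scan = np.interp(np.linspace(0, len(lidar_ranges) - 1, w),
                     np.arange(len(lidar_ranges)), lidar_ranges)
    scan = (scan - scan.mean()) / (scan.std() + 1e-9)

    best_row, best_score = -1, -np.inf
    for r in range(h):
        row = depth_image[r].astype(float)
        # Skip rows dominated by holes or nearly constant depth.
        if not np.isfinite(row).all() or row.std() < 1e-6:
            continue
        row = (row - row.mean()) / (row.std() + 1e-9)
        score = float(np.dot(scan, row)) / w  # normalized cross-correlation
        if score > best_score:
            best_row, best_score = r, score
    return best_row, best_score
```

Once the best-matching row is found, the same correlation score could, in principle, be used in the reverse direction to align depth-image columns against the LiDAR range curve; the paper's actual registration and denoising steps may differ from this sketch.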
