Abstract
The 3-D LiDAR scanner and the 2-D charge-coupled device (CCD) camera are two typical types of sensors for perceiving the surrounding environment in robotics and autonomous driving. They are commonly used together to improve perception accuracy by simultaneously recording the distances of surrounding objects as well as their color and shape information. In this paper, we use the correspondence between a 3-D LiDAR scanner and a CCD camera to rearrange the captured LiDAR point cloud into a dense depth map, in which each 3-D point corresponds to a pixel at the same location in the RGB image. We assume that the LiDAR scanner and the CCD camera are accurately calibrated and synchronized beforehand, so that each 3-D LiDAR point cloud is aligned with its corresponding RGB image. Each frame of the LiDAR point cloud is first projected onto the RGB image plane to form a sparse depth map. A self-adaptive method is then proposed to upsample the sparse depth map into a dense depth map, in which the RGB image and an anisotropic diffusion tensor guide the upsampling by reinforcing RGB-depth compactness. Finally, convex optimization is applied to the dense depth map for global enhancement. Experiments on the KITTI and Middlebury data sets demonstrate that the proposed method outperforms several relevant state-of-the-art methods in both visual comparison and root-mean-square error.
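To make the projection step concrete, the following is a minimal sketch of how a calibrated LiDAR frame can be projected onto the RGB image plane to form a sparse depth map, as described above. The names `T_cam_lidar` (LiDAR-to-camera rigid transform), `K` (camera intrinsics), and the image size are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def lidar_to_sparse_depth(points_lidar, T_cam_lidar, K, image_shape):
    """Project N x 3 LiDAR points into a sparse depth map (sketch only).

    points_lidar : (N, 3) array of 3-D points in the LiDAR frame.
    T_cam_lidar  : (4, 4) rigid transform from LiDAR frame to camera frame (assumed known from calibration).
    K            : (3, 3) camera intrinsic matrix.
    image_shape  : (height, width) of the RGB image.
    """
    h, w = image_shape
    depth = np.zeros((h, w), dtype=np.float32)  # 0 marks "no measurement"

    # Transform points into the camera frame (homogeneous coordinates).
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    pts_cam = pts_cam[pts_cam[:, 2] > 0]

    # Perspective projection onto the image plane.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)

    # Discard projections that fall outside the image bounds.
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    u, v, z = u[valid], v[valid], pts_cam[valid, 2]

    # When several points map to the same pixel, keep the nearest one.
    for ui, vi, zi in zip(u, v, z):
        if depth[vi, ui] == 0 or zi < depth[vi, ui]:
            depth[vi, ui] = zi
    return depth
```

The resulting sparse map would then be the input to the guided upsampling and convex-optimization stages summarized in the abstract.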