Abstract
Camera-LiDAR calibration is a crucial aspect of perception systems and sensor fusion in various fields, facilitating the fusion of data from cameras and LiDAR sensors for applications such as autonomous vehicles, robotics, and augmented reality. In this paper, we present a novel multi-modal fusion method that introduces an efficient camera-LiDAR joint calibration technique using a simple checkerboard. Our method estimates the 3D rigid transformation of the camera with respect to the LiDAR frame by establishing 2D-to-3D point correspondences for geometric calibration. We improve on the traditional Perspective-n-Point (PnP) algorithm by employing point cloud segmentation and clustering to re-match the checkerboard. Compared with the calibration methods in Autoware and the MATLAB calibration toolbox, the reprojection error improved by 38.13% and 58.30%, respectively; the average translation error improved by 78.26% and 51.61%, respectively; and the average rotation error improved by 75% and 60.78%, respectively, a substantial improvement in all three metrics. We also propose a new evaluation metric for joint calibration results based on reprojected images, the PLE (Pixel-Level Evaluation) index, which reflects the accuracy of the joint camera-LiDAR calibration. An automatic calibration software package has been developed for calibrating the camera's intrinsic parameters as well as for jointly calibrating the camera-LiDAR extrinsic parameters. We extensively validated our algorithm using an Intel RealSense Depth Camera D435 and a LeiShen C16-151B LiDAR.
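To make the described pipeline concrete, the sketch below illustrates the generic 2D-to-3D PnP step the abstract refers to, using OpenCV's solvePnP. All point coordinates, intrinsic values, and variable names here are illustrative assumptions for a checkerboard target, not the authors' data or implementation, and the paper's actual contribution (re-matching the checkerboard via point cloud segmentation and clustering) is not reproduced.

```python
# Minimal sketch of the 2D-to-3D PnP step (illustrative only; all sample
# values and names below are assumptions, not the authors' implementation).
import cv2
import numpy as np

# 3D checkerboard corner positions expressed in the LiDAR frame (metres).
lidar_points = np.array([
    [2.10, 0.42, -0.15],
    [2.11, 0.22, -0.15],
    [2.12, 0.02, -0.16],
    [2.10, 0.42, -0.35],
    [2.11, 0.22, -0.35],
    [2.12, 0.02, -0.36],
], dtype=np.float64)

# Matching 2D pixel locations of the same corners in the camera image.
pixel_points = np.array([
    [412.3, 215.7],
    [478.1, 214.9],
    [544.6, 214.2],
    [413.0, 281.4],
    [478.8, 280.6],
    [545.2, 279.9],
], dtype=np.float64)

# Camera intrinsics from a prior checkerboard calibration (example values).
K = np.array([[615.0,   0.0, 320.0],
              [  0.0, 615.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume lens distortion has already been corrected

# Solve PnP: rigid transform taking LiDAR-frame points into the camera frame.
ok, rvec, tvec = cv2.solvePnP(lidar_points, pixel_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix from the rotation vector

# Reprojection error: mean pixel distance between observed and projected corners.
projected, _ = cv2.projectPoints(lidar_points, rvec, tvec, K, dist)
err = np.linalg.norm(projected.reshape(-1, 2) - pixel_points, axis=1).mean()
print(f"R =\n{R}\nt = {tvec.ravel()}\nmean reprojection error: {err:.2f} px")
```

Reprojecting the LiDAR points through the recovered extrinsics and measuring the mean pixel offset, as in the last lines above, is the same reprojection-error criterion the abstract uses to compare against Autoware and the MATLAB calibration toolbox.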