Abstract

Calibration of the intrinsic and extrinsic parameters of a LiDAR-camera sensor suite is a prerequisite for cooperative vehicle-infrastructure systems and autonomous driving systems. However, LiDAR-camera calibration is not trivial in special scenes such as traffic scenes, since traditional target-based calibration methods are constrained by road conditions. Moreover, camera intrinsic parameters that change due to focusing pose additional challenges for such calibration tasks. Hence, a novel targetless method that jointly calibrates the intrinsic and extrinsic parameters of a LiDAR-camera sensor suite is proposed in this paper. 2D and 3D features are extracted from color images and point clouds, respectively. A distance transform image is computed from the 2D feature map and used as the reprojection-error energy function for optimization. Furthermore, the observability of the intrinsic and extrinsic parameters in the calibration system is verified theoretically, and the proposed method is evaluated in both simulated and real-world scenes. Experimental results show that the method is more robust and accurate than other targetless calibration methods.
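The abstract's core idea, scoring projected 3D features against a distance transform of the 2D feature map, can be illustrated with a minimal sketch. The function names and the brute-force distance transform below are illustrative assumptions, not the paper's implementation; the energy is simply the sum of distance-transform values at the projected pixel locations, which is zero when every projection lands on an image feature.

```python
import numpy as np

def distance_transform(edge_map):
    # Brute-force Euclidean distance transform: distance from each pixel
    # to the nearest edge pixel (fine for small illustrative grids; a real
    # pipeline would use an O(n) transform from an image library).
    ys, xs = np.nonzero(edge_map)
    edges = np.stack([ys, xs], axis=1)                      # (K, 2)
    h, w = edge_map.shape
    grid = np.stack(np.mgrid[0:h, 0:w], axis=-1).reshape(-1, 1, 2)
    d = np.linalg.norm(grid - edges[None, :, :], axis=2).min(axis=1)
    return d.reshape(h, w)

def reprojection_energy(dt, pixels):
    # Sum of distance-transform values at the projected LiDAR feature
    # pixels: small when projections align with image features, so
    # minimizing it over the calibration parameters aligns the sensors.
    px = np.round(pixels).astype(int)
    return float(dt[px[:, 0], px[:, 1]].sum())

edge_map = np.zeros((5, 5), dtype=bool)
edge_map[2, :] = True  # a horizontal image edge along row 2
dt = distance_transform(edge_map)
print(reprojection_energy(dt, np.array([[2.0, 1.0]])))  # on the edge -> 0.0
print(reprojection_energy(dt, np.array([[0.0, 1.0]])))  # two rows away -> 2.0
```

In an actual calibration loop, `pixels` would come from projecting 3D point-cloud features through the current intrinsic and extrinsic estimates, and an optimizer would adjust those parameters to minimize the energy.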
