Abstract

The fusion of LiDAR (light detection and ranging) sensors and cameras is widely and effectively employed in the autonomous vehicle, virtual reality, and mobile mapping system (MMS) communities for purposes such as localization, high-definition mapping, and simultaneous localization and mapping (SLAM). However, the extrinsic calibration between a camera and a 3D LiDAR is a fundamental prerequisite for guaranteeing the performance of such fusion. Previous methods are often inaccurate, with calibration errors several times the LiDAR beam divergence, and frequently require special calibration objects, limiting their widespread use. To overcome these shortcomings, we propose a novel, high-accuracy method for the extrinsic calibration between a camera and a 3D LiDAR. Our approach relies on infrared images from a camera fitted with an infrared filter; 2D-3D corresponding points at the wall corners of a scene are extracted to compute the six extrinsic parameters. Experiments using the Velodyne VLP-16 sensor show that the method achieves an extrinsic accuracy at the level of the beam divergence, which is fully analyzed and validated from two different aspects. The calibration method in this paper is therefore highly accurate, effective, and free of special, complicated calibration objects; thus, it meets the requirements of practical applications.
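The abstract does not give implementation details for the pose-estimation step. As a minimal sketch, assuming the six extrinsic parameters (three for rotation, three for translation) are recovered from the extracted 2D-3D correspondences with a standard perspective-n-point (PnP) solver, the computation might look like the following; all coordinates, intrinsics, and poses here are synthetic placeholders, and OpenCV's solvePnP is one common solver choice, not necessarily the one used in the paper:

```python
import numpy as np
import cv2

# Hypothetical wall-corner coordinates in the LiDAR frame (metres).
# In the paper's setup these would be extracted from the point cloud.
object_pts = np.array([
    [0.0, 0.0,  0.0],
    [2.0, 0.0,  0.0],
    [2.0, 1.5,  0.0],
    [0.0, 1.5,  0.0],
    [1.0, 0.0, -0.5],
    [1.0, 1.5, -0.5],
], dtype=np.float64)

# Assumed pinhole intrinsics of the infrared camera; the real values
# would come from a separate intrinsic calibration.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume lens distortion has already been removed

# Ground-truth pose, used here only to synthesize pixel observations;
# in practice the 2D points are detected in the infrared image.
rvec_gt = np.array([0.05, -0.10, 0.02])
tvec_gt = np.array([0.10, -0.20, 3.00])
image_pts, _ = cv2.projectPoints(object_pts, rvec_gt, tvec_gt, K, dist)

# Recover the six extrinsic parameters from the 2D-3D correspondences.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix from the rotation vector
print("rotation:\n", R)
print("translation:", tvec.ravel())
```

With at least four non-coplanar correspondences, such a solver returns a unique rotation and translation; the paper's beam-divergence-level accuracy would then depend on how precisely the wall corners are localized in both the infrared image and the point cloud.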
