Abstract

The reliability of the extrinsic parameters between a 3D Light Detection and Ranging (LiDAR) sensor and a camera is a prerequisite for the proper operation of intelligent perception systems. Compared with structured light cameras, the monocular or binocular cameras commonly used in intelligent perception systems cannot provide equally rich and high-precision environmental information. Extrinsic calibration between the structured light camera and the LiDAR is therefore necessary for fusing information from the two sensors. Here we propose a novel method for extrinsic calibration between a structured light camera and a 3D LiDAR to enhance intelligent perception capability. The hemispherical surface on the calibration board provides enough effective point cloud coverage that the coordinates of the sphere center can be fitted stably and accurately, and the extrinsic parameters can then be solved by treating the sphere centers as reference points. To demonstrate the performance of this method, we evaluate it by rotation error and translation error against ground-truth values obtained from a simulation environment built in Gazebo. The experimental results show that our method yields more reliable and accurate extrinsic parameters between the structured light camera and the LiDAR than the original method when the transformation is not a pure translation.
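The pipeline described above has two numerical cores: fitting a sphere center to the points each sensor observes on the hemisphere, and solving a rigid transform from the resulting center correspondences. The sketch below is illustrative only, not the authors' implementation; it assumes a standard algebraic least-squares sphere fit and the SVD-based Kabsch solution for the rigid transform, with all function and variable names chosen for this example.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit (illustrative, not the paper's code).
    From |p - c|^2 = r^2, linearize as 2 p.c + (r^2 - |c|^2) = |p|^2
    and solve the resulting linear system for c and r."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius

def solve_extrinsics(centers_cam, centers_lidar):
    """Kabsch/Umeyama: rigid transform (R, t) mapping LiDAR-frame sphere
    centers onto the corresponding camera-frame centers."""
    mu_c = centers_cam.mean(axis=0)
    mu_l = centers_lidar.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (centers_lidar - mu_l).T @ (centers_cam - mu_c)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_c - R @ mu_l
    return R, t
```

With several board placements, each placement contributes one camera-frame and one LiDAR-frame center estimate; stacking those pairs and calling `solve_extrinsics` yields the extrinsic rotation and translation in closed form.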
