Abstract
Portable cameras and LiDARs are now widely used in robotic applications for environment perception, path planning, and high-precision navigation. In such applications, an accurate inter-sensor spatial transformation is critical for seamless sensor fusion. However, feasible and efficient extrinsic calibration of the LiDAR-camera sensor pair remains challenging, as it is difficult to establish common feature correspondences between sparse LiDAR point clouds and monocular images. In this contribution, we design a novel calibration board with checkerboard grids and circular holes, through which the extrinsic parameters can be obtained automatically by matching the circular hole centers extracted from both images and heterogeneous LiDAR scans. Benefiting from the versatility and stability of circular-hole extraction, the proposed calibration method is suitable for LiDARs with different precisions and scanning modes, such as Velodyne, Ouster, and Livox LiDARs. Both simulation tests and real-world experiments were conducted to evaluate the effectiveness of the proposed method. The simulated experiments indicate that the translation error is below 0.5 cm and the rotation error is well below 0.3 degrees. In the real-world experiments, the re-projection error reaches sub-pixel level with different LiDARs.
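As a rough illustration of the underlying idea (not the authors' implementation), once the circular-hole centers have been detected in the image (2D) and in the LiDAR point cloud (3D), the extrinsic parameters can be recovered by solving a PnP problem over the matched centers. The sketch below assumes undistorted images, known intrinsics, and hypothetical example detections; all values are placeholders.

```python
import numpy as np
import cv2

# Hypothetical matched detections of the calibration-board hole centers.
lidar_centers_3d = np.array([        # hole centers in the LiDAR frame (meters)
    [2.0,  0.3,  0.1],
    [2.0, -0.3,  0.1],
    [2.0,  0.3, -0.5],
    [2.0, -0.3, -0.5],
], dtype=np.float64)
image_centers_2d = np.array([        # corresponding hole centers in the image (pixels)
    [540.2, 310.7],
    [720.9, 312.1],
    [542.8, 488.3],
    [723.5, 490.0],
], dtype=np.float64)

K = np.array([[800.0, 0.0, 640.0],   # assumed pinhole intrinsics
              [0.0, 800.0, 360.0],
              [0.0,   0.0,   1.0]])
dist = np.zeros(5)                   # assume an undistorted image

# Rotation/translation that maps LiDAR-frame points into the camera frame.
ok, rvec, tvec = cv2.solvePnP(lidar_centers_3d, image_centers_2d, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)
print("R (LiDAR -> camera):\n", R)
print("t (LiDAR -> camera):", tvec.ravel())

# Mean re-projection error in pixels, the accuracy metric cited in the abstract.
proj, _ = cv2.projectPoints(lidar_centers_3d, rvec, tvec, K, dist)
err = np.linalg.norm(proj.reshape(-1, 2) - image_centers_2d, axis=1).mean()
print("mean re-projection error [px]:", err)
```

In practice more than four hole centers, collected from several board poses, would be stacked into the same solver to stabilize the estimate.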