Abstract

Extrinsic calibration of a camera and a LiDAR is necessary to fuse the information from the two sensors. The real trajectory of the LiDAR is not visible in an image, so the accuracy of the extrinsic calibration is usually checked by evaluating the residuals of constraints. In this paper, we present an improved extrinsic calibration algorithm for a camera and a 2D LiDAR that uses an additional dummy camera whose IR cut filter has been removed, which makes it possible to observe the real trajectory of the LiDAR. Some previous algorithms have also used the real trajectory of the LiDAR for extrinsic calibration. However, they removed the IR filter from the calibrating camera itself and adjusted its exposure time, which can affect the result of the extrinsic calibration. We obtain an initial solution with the Hu algorithm, which makes extrinsic calibration possible from just one shot of data. The Hu algorithm, however, gives a result that is sensitive to the pose between the camera-LiDAR system and the calibration structure, which we verify using the real trajectory of the LiDAR. We cope with this problem by refining the initial solution through nonlinear minimization in 3D space using the real trajectory of the LiDAR. Experimental results show that the proposed algorithm gives an improved solution.
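To make the refinement step concrete, here is a minimal Python sketch of refining an initial extrinsic estimate (such as the one-shot Hu solution) by nonlinear least squares over 3D residuals against the observed LiDAR trajectory. The function names are hypothetical, and the point-to-point residual, with trajectory points assumed already reconstructed in 3D from the dummy-camera images, is an assumption about the formulation rather than the paper's exact cost function.

```python
# Hypothetical sketch of the refinement step: starting from an initial
# extrinsic estimate (e.g., from a one-shot method such as the Hu
# algorithm), refine rotation/translation by nonlinear least squares
# over 3D residuals against the observed LiDAR trajectory.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, lidar_pts, traj_pts):
    """3D residuals between LiDAR points mapped into the camera frame
    and trajectory points observed via the filter-less dummy camera
    (assumed already reconstructed in 3D).
    params: rotation vector (3) followed by translation (3)."""
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    t = params[3:6]
    mapped = lidar_pts @ R.T + t          # LiDAR frame -> camera frame
    return (mapped - traj_pts).ravel()    # stacked x/y/z residuals

def refine_extrinsics(rvec0, t0, lidar_pts, traj_pts):
    x0 = np.hstack([rvec0, t0])           # initial solution (e.g., Hu)
    sol = least_squares(residuals, x0, args=(lidar_pts, traj_pts))
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:6]
```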

Highlights

  • Extrinsic calibration between a camera and a LiDAR is a prerequisite step for fusing the information from the two sensors

  • We propose an algorithm that improves the performance of the Hu algorithm [12] by using an additional dummy camera with its infrared cut filter removed, so that the actual trajectory of the LiDAR can be observed in an image

  • We show that the Hu algorithm [12] gives a calibration result that is sensitive to the pose between the system consisting of a camera and a LiDAR and the calibration structure


Introduction

Extrinsic calibration between a camera and a LiDAR is a prerequisite step for fusing the information from the two sensors, since it makes it possible to represent the data of both sensors in a common coordinate system. Sensor fusion between a camera and a LiDAR is performed by projecting LiDAR data onto the corresponding image points; through this, the 3D information from the LiDAR can be combined with the intensity information from the camera. Correspondences between the same physical points must be established, which is possible using the information from the extrinsic calibration. The trajectory of the LiDAR data is not visible in an image because a LiDAR uses a light source in the infrared band. For this reason, the extrinsic calibration of the camera and the LiDAR is usually evaluated by checking the residuals of constraints.
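As background, the following Python sketch (illustrative only; the function name and array shapes are our own) shows how calibrated extrinsics R, t and camera intrinsics K let LiDAR points be projected onto image pixels so that 3D and intensity information can be fused.

```python
# Illustrative only: how calibrated extrinsics (R, t) and camera
# intrinsics K let LiDAR points be projected onto the image so that
# range and intensity information can be fused per pixel.
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """points_lidar: (N, 3) points in the LiDAR frame.
    R, t: extrinsic rotation/translation (LiDAR -> camera).
    K: (3, 3) camera intrinsic matrix. Returns (N, 2) pixel coordinates."""
    pts_cam = points_lidar @ R.T + t      # into the camera frame
    uv = pts_cam @ K.T                    # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]         # perspective division
```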

