Abstract

This paper develops an automatic miscalibration detection and correction framework that maintains accurate LiDAR-camera calibration for autonomous vehicles after sensor drift. First, a monitoring algorithm is designed to continuously detect miscalibration in each frame by leveraging the rotational motion observed by each individual sensor. Then, when sensor drift occurs, the projection constraints between visual feature points and LiDAR 3-D points are used to compute the scaled camera motion, which in turn is used to realign the drifted LiDAR scan with the camera image. Finally, the proposed method is compared against two representative approaches in online experiments with varying levels of random drift, and is then extended to an offline calibration experiment, where it is evaluated against two existing benchmark methods.
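The per-frame monitoring idea can be illustrated with a small sketch. Under correct extrinsics, the rotation each sensor observes between consecutive frames must be consistent with the hand-eye constraint R_cam ≈ R_ext · R_lidar · R_extᵀ; a large geodesic residual flags miscalibration. The function names, the residual form, and the 1° threshold below are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def rotation_angle(R):
    # Geodesic angle (radians) of a rotation matrix, from its trace.
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def miscalibration_residual(R_cam, R_lidar, R_ext):
    # With correct extrinsic rotation R_ext (LiDAR frame -> camera frame),
    # per-frame motions satisfy R_cam ~= R_ext @ R_lidar @ R_ext.T.
    # The residual is the angular distance between the two sides.
    return rotation_angle(R_cam.T @ R_ext @ R_lidar @ R_ext.T)

def is_miscalibrated(R_cam, R_lidar, R_ext, thresh_rad=np.deg2rad(1.0)):
    # Assumed threshold: flag frames whose rotational inconsistency
    # exceeds 1 degree.
    return miscalibration_residual(R_cam, R_lidar, R_ext) > thresh_rad
```

In practice R_cam would come from visual odometry and R_lidar from LiDAR odometry, with the residual accumulated over a short window of frames to suppress per-frame estimation noise before declaring drift.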
