Abstract

LiDAR and cameras are two sensors commonly used in autonomous vehicles. To fuse the data collected by these two sensors and accurately perceive the 3D world, both sensors must have accurately calibrated internal (intrinsic) and external (extrinsic) parameters. However, during the long-term deployment of autonomous vehicles, factors such as equipment aging, transient changes in the external environment, and interference can render the initially correct camera internal parameters invalid for the current conditions, requiring the camera intrinsics to be recalibrated. Since most current work focuses on perception algorithms and the calibration of individual sensors, little research has addressed identifying when a sensor needs to be recalibrated. Consequently, this paper proposes a data-driven method for detecting miscalibrated RGB camera internal parameters. Specifically, a random perturbation factor is first added to the correctly calibrated camera internal parameters to generate incorrect internal parameters; the raw image is then rectified with these incorrect parameters to produce miscalibrated image data. These miscalibrated images are used as input to train a neural network model that detects the miscalibrated parameters. On the KITTI dataset, we trained and deployed models on the data collected from Cam2 and Cam3, respectively, and evaluated both models. The experimental results show that the proposed method has practical value for detecting errors in the calibration of a camera's internal parameters.
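
The abstract does not include code; as a minimal sketch of the data-generation step described above, assuming OpenCV-style intrinsics (a 3x3 matrix K holding focal lengths fx, fy and principal point cx, cy) together with the camera's distortion coefficients, the perturbation and rectification could look roughly like the following Python. The helper names and the +/-10% perturbation range are illustrative assumptions, not details from the paper.

    import cv2
    import numpy as np

    def perturb_intrinsics(K, max_rel_error=0.10, rng=None):
        # Randomly scale fx, fy, cx, cy by up to +/- max_rel_error.
        # The 10% default is an assumed range, not a value from the paper.
        rng = np.random.default_rng() if rng is None else rng
        K_bad = K.astype(np.float64)  # astype returns a copy
        for i, j in [(0, 0), (1, 1), (0, 2), (1, 2)]:  # fx, fy, cx, cy
            K_bad[i, j] *= 1.0 + rng.uniform(-max_rel_error, max_rel_error)
        return K_bad

    def make_miscalibrated_image(raw_img, dist_coeffs, K_bad):
        # Rectify the raw (distorted) image with the *wrong* intrinsics,
        # yielding one miscalibrated training sample.
        h, w = raw_img.shape[:2]
        map_x, map_y = cv2.initUndistortRectifyMap(
            K_bad, dist_coeffs, None, K_bad, (w, h), cv2.CV_32FC1)
        return cv2.remap(raw_img, map_x, map_y, cv2.INTER_LINEAR)

Each training sample would then pair the wrongly rectified image with a label derived from the applied perturbation, so that the network learns to recognize images rectified with inconsistent intrinsics.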
