Abstract

This paper introduces an intelligent method for calibrating radar and camera sensors for data fusion. A camera can recognize pedestrians, non-motorized vehicles, and motor vehicles with high accuracy using advanced deep neural networks, but it is difficult to acquire a target's spatial position and speed from images alone. A radar sensor, by contrast, readily measures a target's spatial position and speed but cannot recognize what the target is. In other words, the strength of each sensor is the weakness of the other, so combining these two complementary sensors is both valid and necessary for answering the question: who is doing what. The first and foremost requirement is to calibrate the radar and camera sensors. Calibration comprises time synchronization and spatial calibration, both of which strongly affect sensor fusion performance; this paper focuses on spatial calibration. Traditional methods, based on geometric projection or the four-point calibration method, cannot achieve sufficiently high calibration accuracy in general scenarios at low time cost or with easy operation. We therefore introduce an online intelligent method that calibrates the radar and camera sensors in general scenarios without manual assistance. We first define spatial calibration, then describe two conventional methods and their shortcomings, and then present the online intelligent calibration method. Finally, we compare the three calibration methods on real data to demonstrate the advantages of the new method.
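To make the four-point baseline mentioned above concrete, the sketch below estimates a planar homography that maps radar ground-plane coordinates to image pixels from four point correspondences, using the standard direct linear transform (DLT). This is an illustrative assumption about how such a four-point calibration can be implemented, not the paper's own code; the function names and coordinate conventions are hypothetical.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src from >= 4 point pairs (DLT).

    src: radar ground-plane points (x, y); dst: matching image pixels (u, v).
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H's 9 entries.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def project(H, pt):
    """Map a 2D point through H with the homogeneous divide."""
    u, v, w = H @ np.array([pt[0], pt[1], 1.0])
    return u / w, v / w

# Example: four marker positions measured by radar (meters) and their pixels.
radar_pts = [(0.0, 0.0), (10.0, 0.0), (10.0, 20.0), (0.0, 20.0)]
image_pts = [(100.0, 400.0), (500.0, 380.0), (520.0, 120.0), (90.0, 140.0)]
H = fit_homography(radar_pts, image_pts)
```

With exactly four correspondences in general position the fit is exact, which is precisely the weakness the abstract points at: any measurement error in those four manually chosen points propagates directly into the projection, with no redundancy to average it out.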
