Abstract
A novel method is presented for calibrating a sensor fusion system for intelligent vehicles. In this example, the sensors are a camera and a laser scanner that observe the same scene from different viewpoints. The method employs the Nelder–Mead direct search algorithm to minimize the sum of squared errors between the image coordinates and the re-projected laser data, iteratively refining the calibration parameters. The method is applied to a real data set collected from a test vehicle. Using only 11 well-spaced target points observable by both sensors, 12 intrinsic and extrinsic parameters describing the geometric relationship between the sensors can be estimated to give an accurate projection. Experiments show that the method can project the laser points onto the image plane with an average error of 1.01 pixels (1.51 pixels worst case).
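The following is a minimal sketch of the optimization idea the abstract describes, not the authors' actual implementation: it assumes a pinhole-camera projection model, an illustrative 12-parameter split (6 extrinsic, 6 intrinsic including two radial distortion terms), synthetic placeholder data for the 11 target points, and SciPy's Nelder–Mead solver as the direct search routine.

```python
import numpy as np
from scipy.optimize import minimize

def project(params, laser_pts):
    """Project 3-D laser points into the image using 12 parameters:
    rotation (rx, ry, rz), translation (tx, ty, tz), focal lengths
    (fx, fy), principal point (cx, cy), and radial distortion (k1, k2).
    This pinhole model is an assumption for illustration only."""
    rx, ry, rz, tx, ty, tz, fx, fy, cx, cy, k1, k2 = params
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx),  np.cos(rx)]])
    Ry = np.array([[ np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz),  np.cos(rz), 0],
                   [0, 0, 1]])
    cam = laser_pts @ (Rz @ Ry @ Rx).T + np.array([tx, ty, tz])
    x, y = cam[:, 0] / cam[:, 2], cam[:, 1] / cam[:, 2]
    r2 = x**2 + y**2
    d = 1 + k1 * r2 + k2 * r2**2  # radial distortion factor
    return np.column_stack([fx * x * d + cx, fy * y * d + cy])

def sse(params, laser_pts, image_pts):
    """Sum of squared re-projection errors: the cost Nelder-Mead minimizes."""
    return np.sum((project(params, laser_pts) - image_pts) ** 2)

# 11 target points seen by both sensors (synthetic placeholder data).
rng = np.random.default_rng(0)
laser_pts = rng.random((11, 3)) * [4, 2, 1] + [0, -1, 5]
true = np.array([0.02, -0.01, 0.03, 0.1, -0.2, 0.5,
                 800, 800, 320, 240, 0.0, 0.0])
image_pts = project(true, laser_pts)

# Rough initial guess, refined by direct search on the 12 parameters.
x0 = np.array([0, 0, 0, 0, 0, 0, 750, 750, 310, 230, 0, 0], dtype=float)
res = minimize(sse, x0, args=(laser_pts, image_pts),
               method="Nelder-Mead",
               options={"maxiter": 50000, "maxfev": 50000,
                        "xatol": 1e-10, "fatol": 1e-12})
rms = np.sqrt(res.fun / len(laser_pts))
print(f"converged: {res.success}, RMS re-projection error: {rms:.3f} px")
```

In practice the quality of the initial guess matters: Nelder–Mead is derivative-free but can stall far from the optimum in 12 dimensions, so a coarse manual estimate of the sensor pose is typically used as the starting simplex.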