Abstract

This paper presents a novel method for calibrating a sensor fusion system comprising a camera and a laser scanner. The method employs the Nelder-Mead direct search algorithm to minimise the sum of squared errors between the image co-ordinates and the re-projected laser data by iteratively adjusting and improving the calibration parameters. Using 11 well-spaced target points observable by both sensors, 12 unknown intrinsic and extrinsic calibration parameters can be estimated. Experiments show that the method can project the laser points onto the image plane with an average error of 0.52 px.
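
To illustrate the optimisation step described in the abstract, the following is a minimal sketch of estimating calibration parameters with the Nelder-Mead method via scipy.optimize.minimize. The pinhole projection model, the 10-value parameter layout, and the synthetic target points are assumptions made here for demonstration only; they are not the paper's parameterisation (which estimates 12 unknowns) or its experimental data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation


def project(params, pts):
    """Pinhole projection of 3-D laser points into the image plane.

    params = [fx, fy, cx, cy, rx, ry, rz, tx, ty, tz] is an assumed
    parameterisation (10 values); the paper estimates 12 unknowns whose
    exact composition is not given in the abstract.
    """
    fx, fy, cx, cy = params[0:4]
    rot = Rotation.from_rotvec(params[4:7])
    cam = rot.apply(pts) + params[7:10]            # laser frame -> camera frame
    uv = cam[:, :2] / cam[:, 2:3]                  # perspective division
    return np.column_stack((fx * uv[:, 0] + cx, fy * uv[:, 1] + cy))


def cost(params, laser_pts, image_pts):
    """Sum of squared re-projection errors: the objective being minimised."""
    residual = project(params, laser_pts) - image_pts
    return np.sum(residual ** 2)


# Synthetic stand-in for the 11 well-spaced target points and a ground-truth
# calibration (purely illustrative, not the paper's data).
rng = np.random.default_rng(0)
laser_pts = rng.uniform([-1.0, -1.0, 2.0], [1.0, 1.0, 4.0], size=(11, 3))
truth = np.array([800.0, 800.0, 320.0, 240.0, 0.05, -0.02, 0.01, 0.1, -0.05, 0.2])
image_pts = project(truth, laser_pts) + rng.normal(0.0, 0.5, size=(11, 2))

# Iteratively adjust the parameters with Nelder-Mead from a rough initial guess.
x0 = np.array([700.0, 700.0, 300.0, 220.0, 0, 0, 0, 0, 0, 0])
result = minimize(cost, x0, args=(laser_pts, image_pts), method="Nelder-Mead",
                  options={"maxiter": 50000, "maxfev": 50000,
                           "xatol": 1e-9, "fatol": 1e-9})

err = np.linalg.norm(project(result.x, laser_pts) - image_pts, axis=1)
print("mean re-projection error (px):", err.mean())
```

In this sketch the derivative-free Nelder-Mead search only needs the scalar cost function, which mirrors the abstract's description of iteratively refining the calibration parameters to reduce the re-projection error.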
