Abstract

Fusing heterogeneous exteroceptive sensors is one of the most efficient and effective ways to represent the environment precisely, as it compensates for the individual shortcomings of each homogeneous sensor. The rigid transformation (i.e., the extrinsic parameters) between the sensors must be known before the multisensor information can be fused accurately. Researchers have proposed several approaches to estimate these extrinsic parameters; however, they require either auxiliary objects, such as chessboards, or human intervention to select correspondences. In this paper, we propose a novel approach for the extrinsic calibration of range and image sensors that requires neither auxiliary objects nor human intervention. We first estimate the initial extrinsic parameters from the individual motions of the 3D range finder and the camera. We then extract lines in corresponding image and point-cloud pairs and match them using the initial extrinsic parameters. Finally, we refine the extrinsic parameters using the line feature associations.
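As a rough illustration of the motion-based initialization step, the sketch below solves a hand-eye style system A_i X = X B_i from synchronized camera and range-finder motion estimates. The function names, the NumPy formulation, and the simplified axis-angle alignment are illustrative assumptions, not the paper's implementation.

```python
# A minimal sketch of motion-based extrinsic initialization (hand-eye style),
# assuming synchronized camera motions A_i and LiDAR motions B_i are given as
# 4x4 homogeneous transforms. All names here are hypothetical.
import numpy as np

def rotation_axis_angle(R):
    """Simplified SO(3) log map: rotation axis scaled by rotation angle.
    Degenerate cases near 180 degrees are not handled in this sketch."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    return axis * angle

def init_extrinsics_from_motion(cam_motions, lidar_motions):
    """Estimate X (range finder to camera) from motion pairs A_i X = X B_i.
    Requires at least two motion pairs with non-parallel rotation axes."""
    # 1. Rotation: the rotation vectors satisfy a_i ~= R_x b_i, so align them
    #    with a Kabsch-style SVD solve.
    a = np.stack([rotation_axis_angle(A[:3, :3]) for A in cam_motions])
    b = np.stack([rotation_axis_angle(B[:3, :3]) for B in lidar_motions])
    H = b.T @ a                                   # sum of outer products b_i a_i^T
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R_x = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # 2. Translation: stack the linear constraints (R_A - I) t_x = R_x t_B - t_A
    #    and solve them in a least-squares sense.
    M, v = [], []
    for A, B in zip(cam_motions, lidar_motions):
        M.append(A[:3, :3] - np.eye(3))
        v.append(R_x @ B[:3, 3] - A[:3, 3])
    t_x, *_ = np.linalg.lstsq(np.vstack(M), np.concatenate(v), rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_x, t_x
    return X
```

In such a pipeline, the returned X would only serve as the initial guess; the subsequent line extraction, matching, and refinement stages described in the abstract would tighten it further.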
