Abstract
Fusing heterogeneous exteroceptive sensors is an effective way to represent the environment precisely, since the weaknesses of each individual sensor are compensated by the others. Before multisensor information can be fused accurately, however, the rigid transformation (i.e., the extrinsic parameters) between the sensors must be known. Several approaches have been proposed to estimate these extrinsic parameters, but they require either auxiliary objects, such as chessboards, or human intervention to select correspondences. In this paper, we propose a novel approach for the extrinsic calibration of range and image sensors that requires neither auxiliary objects nor human intervention. We first estimate the initial extrinsic parameters from the individual motions of the 3D range finder and the camera. We then extract lines from image and point-cloud pairs and match them using the initial extrinsic parameters. Finally, we refine the extrinsic parameters using the line-feature associations.
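The first step of the pipeline, recovering an initial extrinsic estimate from each sensor's individual motion, is in the spirit of classical hand-eye calibration: if the camera and the range finder undergo rigid motions with rotations related by conjugation through the unknown extrinsic rotation, then their rotation axes are related by that same rotation. The sketch below illustrates only the rotational part of this initialization using an axis-alignment (Kabsch/SVD) solve; function names and the overall structure are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def rotation_axis(R):
    # Unit rotation axis from the antisymmetric part of R,
    # 2 sin(theta) [a]_x = R - R^T (valid away from 0 and pi).
    v = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def hand_eye_rotation(cam_motions, lidar_motions):
    # For each motion pair, the camera-from-lidar rotation R_cl
    # maps lidar motion axes onto camera motion axes:
    #   a_cam_i = R_cl @ a_lidar_i.
    # Stack the axes and solve the orthogonal Procrustes problem.
    A = np.stack([rotation_axis(R) for R in lidar_motions])  # N x 3
    B = np.stack([rotation_axis(R) for R in cam_motions])    # N x 3
    M = A.T @ B                                 # cross-covariance
    U, _, Vt = np.linalg.svd(M)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # enforce det(R) = +1
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

At least two motion pairs with non-parallel rotation axes are needed for a unique solution; in practice the estimate would be computed over many motion segments and then refined, together with the translation, by the line-feature associations described above.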