Abstract

Extrinsic calibration of multi-sensor imaging systems has enabled unprecedented data fusion capabilities in computer vision and robotics over the past few decades. In this paper, we propose a simple method for estimating the mutual rotation and translation in a multi-sensor system consisting of six omnidirectional RGB cameras and a shared 3D Light Detection and Ranging (LIDAR) sensor, using a planar chessboard pattern. We mount the sensors on a specially designed hexagonal plate and treat each camera-LIDAR pair as an independent multi-sensor unit. For each unit, we simultaneously capture chessboard images and the corresponding three-dimensional (3D) point data at several board orientations. Two-dimensional (2D) chessboard corners are back-projected into 3D space for plane fitting. The RANSAC algorithm is applied to the LIDAR points to reject outliers before they are used for plane fitting. The mutual rotation between the camera and the LIDAR is computed by aligning the normal vectors of the fitted planes. An arbitrary point on the camera plane is projected onto the LIDAR plane, and the point-to-plane distance is minimized to estimate the mutual translation. The accuracy of the proposed method is evaluated through scene fusion experiments.
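As a rough illustration of the alignment step described above, the sketch below (Python/NumPy, not taken from the paper) estimates the mutual rotation by Kabsch-style alignment of corresponding plane normals and recovers the translation from point-to-plane constraints. All function and variable names are hypothetical; it assumes unit normals and at least three board poses with non-parallel normals.

```python
import numpy as np

def estimate_rotation(cam_normals, lidar_normals):
    """Kabsch-style alignment of corresponding unit plane normals.

    cam_normals, lidar_normals: (N, 3) arrays, one row per board pose.
    Returns R such that R @ n_cam ~= n_lidar for each pose.
    """
    H = cam_normals.T @ lidar_normals            # 3x3 cross-covariance of normals
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det(R) = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

def estimate_translation(R, cam_points, lidar_normals, lidar_ds):
    """Least-squares t from point-to-plane constraints n^T (R p + t) = d.

    cam_points:    (N, 3) one point per pose, taken from each camera plane.
    lidar_normals: (N, 3) unit normals of the fitted LIDAR planes.
    lidar_ds:      (N,)   plane offsets, with each plane defined by n^T x = d.
    """
    A = lidar_normals                            # each row contributes n^T t
    b = lidar_ds - np.einsum('ij,ij->i', lidar_normals, cam_points @ R.T)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)    # needs >= 3 independent normals
    return t
```

With fewer than three linearly independent plane normals, the translation system is rank-deficient, which is consistent with the paper's use of several board orientations per camera-LIDAR unit.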
