Abstract

Extrinsic calibration of a camera and a 2D laser range finder (lidar) is crucial in sensor data fusion applications, for example in SLAM algorithms used on mobile robot platforms. The fundamental challenge of extrinsic calibration arises when the camera and lidar sensors do not overlap or share the same field of view. In this paper we propose a novel and flexible approach for the extrinsic calibration of a camera–lidar system without overlap, which can be used for robotic platform self-calibration. The approach is based on the robot–world hand–eye calibration (RWHE) problem, which is proven to have efficient and accurate solutions. First, the system was mapped to the RWHE calibration problem, modeled as the linear relationship AX = ZB, where X and Z are the unknown calibration matrices. Then, we computed the required transformation matrix, which was the main challenge in the above mapping. The computation is based on reasonable assumptions about the geometric structure of the calibration environment. The reliability and accuracy of the proposed approach are compared to a state-of-the-art method in extrinsic 2D lidar-to-camera calibration. Experimental results from real datasets indicate that the proposed approach provides better results, with L2-norm translational and rotational deviations of 314 mm and , respectively.
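The RWHE relationship AX = ZB links two measured homogeneous transforms A and B through the two unknown calibration transforms X and Z. The following sketch illustrates that relationship on synthetic data; the frame assignments, the `transform` helper, and all numeric values are illustrative assumptions, not the paper's actual solver.

```python
import numpy as np

def transform(rx, ry, rz, t):
    """Build a 4x4 homogeneous transform from Euler angles (rad) and a translation."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # ZYX rotation order (a common convention)
    T[:3, 3] = t
    return T

# Hypothetical ground-truth unknowns X and Z (values chosen arbitrarily)
X = transform(0.1, -0.2, 0.3, [0.05, 0.00, 0.10])
Z = transform(0.0, 0.4, -0.1, [1.00, 0.50, 0.00])

# Simulate one measurement pair (A, B): given B, the consistent A is Z B X^{-1}
B = transform(0.2, 0.1, -0.3, [0.30, -0.20, 0.40])
A = Z @ B @ np.linalg.inv(X)

# The pair satisfies the RWHE constraint A X = Z B
assert np.allclose(A @ X, Z @ B)
```

In practice many such (A, B) pairs are collected and X, Z are estimated jointly, e.g. by linear least squares followed by nonlinear refinement.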

Highlights

  • Accurate extrinsic calibration between different sensors is an important task in many automation applications

  • We briefly review some of the state-of-the-art approaches in extrinsic calibration of lidar–camera sensors

  • We assume a pin-hole camera model for the cameras, with radial and tangential lens distortion as described in [35]. (For configurations 1 and 2 in Figure 2, the individual intrinsic camera calibrations and the remaining extrinsic transformations, e.g., between the left and right cameras or between the left camera and the PTZ camera, are assumed to be computed using standard stereo camera calibration [30,46] when needed.) All of the results shown in this paper were generated on a workstation with two quad-core processors and 8 GB of RAM
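As an illustration of a pin-hole projection with radial and tangential distortion, the sketch below implements the widely used Brown-style model with two radial and two tangential coefficients; the exact model of [35] may differ, and the intrinsics and coefficient names here are illustrative assumptions.

```python
import numpy as np

def project(P, K, dist):
    """Project a 3D point P (camera frame) through a pin-hole model with
    radial/tangential distortion. dist = (k1, k2, p1, p2), illustrative names."""
    k1, k2, p1, p2 = dist
    x, y = P[0] / P[2], P[1] / P[2]               # normalized image coordinates
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2         # radial distortion factor
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.array([K[0, 0] * xd + K[0, 2],      # u = fx * xd + cx
                     K[1, 1] * yd + K[1, 2]])     # v = fy * yd + cy

# Hypothetical intrinsics: fx = fy = 800 px, principal point (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
uv = project(np.array([0.1, -0.05, 2.0]), K, (0.0, 0.0, 0.0, 0.0))
```

With all distortion coefficients set to zero the model reduces to the ideal pin-hole projection, which is a convenient sanity check.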

Summary

Introduction

Accurate extrinsic calibration between different sensors is an important task in many automation applications. (One might argue that the RWHE formulation complicates the extrinsic calibration problem as originally posed; we demonstrate throughout the paper that this is not the case.) Third, such a formulation does not constrain the extrinsic calibration setup of a multi-sensor system consisting of a 2D lidar and a camera to have an overlapping field of view. This allows for many possible calibration configurations of the underlying sensors.

Recent Work and Contributions
Targetless
Target-Based
Our Contributions
Robot–World Hand–Eye Calibration Problem
The Proposed Calibration Procedure
Camera–Lidar Configurations
Calibration Environment
The Computation of the RF-to-BF Transformation Matrix
Solving the Extrinsic Calibration Parameters
Verifying Accuracy
Experimental Results
Calibration Results for Configurations 1 and 3
Calibration Results for Configurations 2 and 4
Comparison with Zhang and Pless Method
Uniqueness of the Proposed Approach
Conclusions
