Abstract

There is a growing need for 3D colored maps acquired from multi-sensor moving platforms. Accurate multi-sensor data alignment is an essential prerequisite for constructing 3D colored maps from simultaneously acquired camera and Light Detection and Ranging (LiDAR) data. However, current alignment methods are hampered by low automation, heavy computational costs, or tedious system calibration set-ups. In this paper, we consider a LiDAR–global navigation satellite system (GNSS)/inertial navigation system (INS)–camera system mounted on an Unmanned Aerial Vehicle (UAV) platform. We present a detailed literature review of existing calibration methods for such systems and propose a new, versatile, automatic, and targetless calibration method for this system. This method estimates the calibration parameters by optimizing the correspondence between pairs of conjugate image points extracted from overlapping images and the projections of these points onto the georeferenced LiDAR point cloud. Experiments on real data show the suitability of this method for the construction of 3D colored point clouds. Quantitative calibration results using checkpoints indicate that the obtained calibration accuracy is compatible with the accuracy of the georeferenced LiDAR point cloud, i.e., 5 cm. Further experiments on simulated data show the robustness of this approach to the initial calibration parameters and its low sensitivity to LiDAR point cloud density and noise. As this method is quite flexible, we believe it is better suited to 3D colored map generation than other methods proposed in the literature.
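
To make the optimization described above concrete, the sketch below illustrates one plausible form of such a reprojection-based objective. It is not the paper's implementation: it assumes a pinhole camera with known intrinsics K, per-exposure GNSS/INS poses, a boresight-plus-lever-arm parameterization of the camera-to-body calibration, and approximates the projection of an image ray onto the georeferenced cloud by a nearest-neighbor search; names such as `ray_cloud_point` and `residuals` are hypothetical.

```python
# Hypothetical sketch of a targetless camera-LiDAR calibration objective.
# NOT the paper's algorithm: parameterization, ray-cloud projection, and all
# names are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation


def ray_cloud_point(origin, direction, tree, cloud, t_max=100.0, n=200):
    """Approximate the intersection of a camera ray with the LiDAR cloud:
    sample along the ray and return the cloud point nearest to the best sample."""
    ts = np.linspace(0.1, t_max, n)
    samples = origin[None, :] + ts[:, None] * direction[None, :]
    dists, idx = tree.query(samples)
    return cloud[idx[np.argmin(dists)]]


def residuals(params, pairs, poses, K_inv, tree, cloud):
    """params = [roll, pitch, yaw, tx, ty, tz]: assumed boresight angles and
    lever arm mapping the camera frame into the INS body frame.

    pairs: list of conjugate pixel pairs ((u1, v1), (u2, v2)) from two
    overlapping images; poses: matching list of ((R_wb, t_wb), (R_wb, t_wb))
    GNSS/INS body-to-world poses at the two exposure times.
    """
    R_bc = Rotation.from_euler("xyz", params[:3]).as_matrix()
    t_bc = params[3:]
    res = []
    for (uv1, uv2), (pose1, pose2) in zip(pairs, poses):
        pts = []
        for uv, (R_wb, t_wb) in ((uv1, pose1), (uv2, pose2)):
            d_cam = K_inv @ np.array([uv[0], uv[1], 1.0])  # pixel -> camera ray
            d_world = R_wb @ (R_bc @ d_cam)                # rotate ray into world frame
            d_world /= np.linalg.norm(d_world)
            o_world = R_wb @ t_bc + t_wb                   # camera center in world frame
            pts.append(ray_cloud_point(o_world, d_world, tree, cloud))
        # Conjugate rays should meet at the same cloud point when calibration is correct.
        res.append(pts[0] - pts[1])
    return np.concatenate(res)


# Usage outline (all data names are placeholders):
# tree = cKDTree(cloud)                      # cloud: (N, 3) georeferenced LiDAR points
# sol = least_squares(residuals, x0=np.zeros(6), loss="huber",
#                     args=(pairs, poses, np.linalg.inv(K), tree, cloud))
```

If the calibration parameters are correct, both rays of a conjugate pair land on (nearly) the same surface point and the residuals vanish; a robust loss such as `loss="huber"` in `least_squares` helps suppress outlier image matches.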
