Abstract

Mobile Mapping is an efficient technology to acquire spatial data of the environment. The spatial data is fundamental for applications in crisis management, civil engineering or autonomous driving. The extrinsic calibration of the Mobile Mapping System is a decisive factor that affects the quality of the spatial data. Many existing extrinsic calibration approaches require artificial targets in a time-consuming calibration procedure. Moreover, they are usually designed for a specific combination of sensors and are thus not universally applicable. We introduce a novel extrinsic self-calibration algorithm that is fully automatic and completely data-driven. The fundamental assumption of the self-calibration is that the calibration parameters are estimated best when the derived point cloud best represents the real physical circumstances. The cost function we use to evaluate this is based on geometric features derived from the 3D structure tensor of the local neighborhood of each point. We compare different cost functions based on geometric features and a cost function based on the Rényi quadratic entropy to evaluate their suitability for the self-calibration. Furthermore, we test the self-calibration on a synthetic dataset and on two different real datasets. The real datasets differ in terms of the environment, the scale and the utilized sensors. We show that the self-calibration is able to extrinsically calibrate Mobile Mapping Systems with different combinations of mapping and pose estimation sensors, such as a 2D laser scanner to a Motion Capture System and a 3D laser scanner to a stereo camera and ORB-SLAM2. For the first dataset, the parameters estimated by our self-calibration lead to a more accurate point cloud than two comparative approaches. For the second dataset, which was acquired by vehicle-based mobile mapping, our self-calibration achieves results comparable to a manually refined reference calibration, while being universally applicable and fully automated.
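To make the cost-function idea concrete, the sketch below computes standard eigenvalue-based geometric features (linearity, planarity, sphericity) from the 3D structure tensor of a point's local neighborhood; in man-made environments, a well-calibrated point cloud typically yields more pronounced linear and planar neighborhoods than a miscalibrated one. The function name, the chosen feature set and the NumPy-based implementation are illustrative assumptions and not the paper's exact formulation.

```python
import numpy as np

def structure_tensor_features(neighborhood):
    """Eigenvalue-based geometric features of one local point neighborhood.

    neighborhood: (k, 3) array with the k nearest neighbors of a query point.
    Feature names and selection are illustrative, not the paper's exact choice.
    """
    centered = neighborhood - neighborhood.mean(axis=0)
    # 3D structure tensor (local covariance matrix) of the neighborhood
    tensor = centered.T @ centered / len(neighborhood)
    # eigenvalues sorted so that l1 >= l2 >= l3 >= 0
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(tensor))[::-1]
    eps = 1e-12
    return {
        "linearity":  (l1 - l2) / (l1 + eps),   # high for edges, poles, wires
        "planarity":  (l2 - l3) / (l1 + eps),   # high for walls, ground, roofs
        "sphericity": l3 / (l1 + eps),          # high for noisy, volumetric clutter
    }
```

A self-calibration cost could, for example, aggregate such features over all points (e.g. reward planarity or penalize sphericity) and be optimized over the extrinsic parameters; the exact cost functions compared by the authors are not reproduced here.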

Highlights

  • Mobile Mapping is an efficient technology to acquire spatial data of the environment

  • The quality of the 3D point cloud captured with a Mobile Mapping System (MMS) is limited by the accuracy of the mapping sensor itself and mainly by three further components: the estimation of the pose of the MMS, the intrinsic calibration of the individual sensors, and the extrinsic calibration of the MMS

  • The quasi-rigorous approach has successfully been used for Unmanned Aerial Vehicle (UAV)-based mobile mapping [27]. As it is important for the interpretation of the results of our real-data experiments in Section 4.2.2, we briefly summarize some details of this work: for a typical UAV-based mobile mapping, five of the six calibration parameters can be estimated robustly based solely on the flight data



Introduction

Mobile Mapping is an efficient technology to acquire spatial data of the environment. Depending on the scale of the environment, mobile platforms like airplanes, cars, Unmanned Aerial Vehicles (UAVs) or mapping backpacks are considered. The mobile platform is typically equipped with one or more mapping sensors. A widely utilized mapping sensor is a laser scanner, also known as a Light Detection And Ranging (LiDAR) sensor, since it acquires accurate and dense spatial data of the environment in the form of a 3D point cloud. The quality of the 3D point cloud captured with a Mobile Mapping System (MMS) is limited by the accuracy of the mapping sensor itself and mainly by three further components: the estimation of the pose of the MMS, the intrinsic calibration of the individual sensors, and the extrinsic calibration of the MMS.
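To clarify where the extrinsic calibration enters, the sketch below shows the usual rigid-body chain that turns a raw sensor measurement into a world-frame point: each sensor point is first mapped into the platform frame by the extrinsic calibration and then into the world frame by the time-dependent platform pose. The function and variable names are chosen for illustration and are not taken from the paper.

```python
import numpy as np

def sensor_point_to_world(p_sensor, R_extr, t_extr, R_pose, t_pose):
    """Map a point from the mapping-sensor frame into the world frame.

    R_extr, t_extr: extrinsic calibration (sensor frame -> platform frame),
                    constant for a rigidly mounted sensor.
    R_pose, t_pose: platform pose (platform frame -> world frame) at the
                    measurement time, e.g. from GNSS/IMU, SLAM or a
                    Motion Capture System.
    """
    p_platform = R_extr @ p_sensor + t_extr   # apply extrinsic calibration
    return R_pose @ p_platform + t_pose       # apply time-dependent pose
```

An error in R_extr or t_extr distorts every point of the resulting cloud, which is why the extrinsic calibration is a decisive factor for point cloud quality.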
