Abstract

Modern mobile mapping systems include one or several laser scanners and cameras. The main outputs of these systems are oriented camera images and 3D point clouds. These point clouds can be derived from pairs of overlapping images or from the laser raw data combined with the platform trajectory. A mobile mapping campaign may include several overlapping areas; in general, the derived point clouds of the same area are not properly registered, due to partial or total GNSS occlusions, multipath, and inertial drift and noise. Nowadays, the standard procedure for co-registration, both between laser scanners and between camera and laser, includes several steps. The first is the system calibration, in which the lever arm and boresight between laser and IMU, and between camera and IMU, are determined. After the calibration steps, camera and LiDAR point clouds can be derived. Then, co-registrations between LiDAR point clouds and between the camera point cloud and the LiDAR point cloud are computed. In contrast to this standard approach, in this paper we propose to solve the orientation and calibration of laser and camera data in a single, combined adjustment. Solving the orientation and calibration together allows us to deal implicitly with the co-registration problem. The proposed method is based on the identification of common tie features between images and point clouds and their use in a combined adjustment. These common tie features are straight line segments. The preliminary results indicate the feasibility and the potential of the approach.
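
To illustrate the role of the straight-line tie features, consider a minimal sketch of a line-in-plane condition (our notation, not necessarily the paper's): a tie plane is parameterized by a unit normal n and offset d, determined by the laser points that sample it, and X_1, X_2 denote the 3D end points of an image-derived straight line segment, obtained from the image observations and the (unknown) camera orientation. The segment lies in the plane when both end points satisfy the plane equation:

    n^{T} X_i + d = 0, \qquad i = 1, 2, \qquad \lVert n \rVert = 1

Written for all common features, conditions of this kind couple the camera parameters to the laser-derived planes within one adjustment.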

Highlights

  • Nowadays, 3D georeferenced data are widely used as primary data for many applications such as 3D city modelling, cadastral mapping, cultural heritage, facility management and traffic accident investigation, to mention a few examples

  • In our combined camera-laser Integrated Sensor Orientation (ISO) concept, laser raw data are used instead of processed point clouds. 3D coordinates of point-cloud points are not explicitly computed: the laser points that define the tie planes (TPL) are parameterized with raw range and scan-angle measurements (see the direct georeferencing sketch after this list)

  • In order to extend this concept to mobile mapping systems, the presented models remain valid without modification, except for the MMS plane observation equations
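
For reference, the raw-measurement parameterization mentioned in the second highlight can be related to the usual direct georeferencing form for a laser measurement (a generic textbook sketch in our notation, not necessarily the paper's exact model): a range \rho and scan angle \theta observed at time t map to a point in the mapping frame as

    X^{m}(t) = X^{m}_{GNSS/INS}(t) + R_{b}^{m}(t) \left( a^{b} + R_{s}^{b} \, x^{s}(\rho, \theta) \right)

where X^{m}_{GNSS/INS}(t) and R_{b}^{m}(t) are the trajectory position and attitude, a^{b} is the laser-IMU lever arm, R_{s}^{b} is the boresight rotation, and x^{s}(\rho, \theta) is the polar-to-Cartesian conversion of the raw measurements in the scanner frame (the axis convention depends on the instrument). The lever arm and boresight are precisely the calibration unknowns addressed by the combined adjustment.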


Summary

INTRODUCTION

3D georeferenced data are widely used as primary data for many applications such as 3D city modelling, cadastral mapping, cultural heritage, facility management and traffic accident investigation, to mention a few examples. For aerial laser and mobile mapping data, ISO has been proven effective for IMU-laser boresight calibration with single and multiple laser scanners (Skaloud and Lichti, 2006), (Chan et al., 2013). In contrast to the standard approach, in this paper we propose to solve the orientation and calibration of laser and camera data in a single, combined adjustment. The proposed method is based on the identification of common tie features between images and point clouds and their use in a combined adjustment. These common tie features are straight line segments. The last section summarizes the conclusions of the proposed approach and discusses future improvements.
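
A minimal numerical sketch of the line-in-plane idea is given below (hypothetical, not the authors' implementation; a real combined ISO adjustment would additionally carry trajectory, boresight, lever-arm and camera exterior orientation unknowns, and would parameterize laser points by raw range and scan angle). It sets up a small least-squares problem in which a tie plane is estimated from laser points while the end points of an image-derived segment are constrained to lie in the same plane; all data are synthetic.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)

    # Synthetic laser points roughly sampling the plane z = 1.
    laser_pts = np.column_stack([
        rng.uniform(0.0, 10.0, 50),
        rng.uniform(0.0, 10.0, 50),
        1.0 + rng.normal(0.0, 0.01, 50),
    ])

    # End points of a (hypothetical) image-derived 3D tie segment near the same plane.
    seg_ends = np.array([[1.0, 2.0, 1.01],
                         [8.0, 9.0, 0.99]])

    def residuals(x):
        # x = (n1, n2, n3, d); only the direction of the first three entries
        # matters, since the normal is re-normalized; d is the plane offset.
        n = x[:3] / np.linalg.norm(x[:3])
        d = x[3]
        r_laser = laser_pts @ n + d   # laser-point-in-plane residuals
        r_line = seg_ends @ n + d     # line-in-plane residuals (both end points)
        return np.concatenate([r_laser, r_line])

    sol = least_squares(residuals, x0=np.array([0.1, 0.1, 1.0, -0.5]))
    n_hat = sol.x[:3] / np.linalg.norm(sol.x[:3])
    print("estimated plane normal:", n_hat)
    print("estimated plane offset:", sol.x[3])

Stacking such residuals for many tie planes and tie segments, together with the calibration and orientation unknowns, is the structural pattern of a combined adjustment of this kind.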

PROPOSED APPROACH
MATHEMATICAL MODELS
Some naming and notation conventions
MMS plane observation equation
Line-in-plane observation equations
CONCEPT VALIDATION
CONCLUSIONS AND FURTHER RESEARCH