Abstract

In this paper, a new method for fusing optical and laserscanner data is presented for improved UAV-borne 3D mapping. We propose to equip an unmanned aerial vehicle (UAV) with a small platform which includes two sensors: a standard low-cost digital camera and a lightweight Hokuyo UTM-30LX-EW laserscanning device (210 g without cable). Initially, a calibration is carried out for the utilized devices. This involves a geometric camera calibration and the estimation of the position and orientation offset between the two sensors via lever-arm and bore-sight calibration. Subsequently, feature tracking is performed through the image sequence by considering extracted interest points as well as the projected 3D laser points. These 2D results are fused with the measured laser distances and fed into a bundle adjustment in order to perform Simultaneous Localization and Mapping (SLAM). It is demonstrated that fusing optical and laserscanner data improves the precision of the pose estimation.
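The projection of 3D laser points into the image, which enables the joint feature tracking described above, can be sketched as follows. The lever-arm translation, bore-sight angles, and camera intrinsics below are hypothetical placeholder values; in the proposed method they come from the system calibration step.

```python
import numpy as np

# Hypothetical lever-arm offset (m) and bore-sight angles (rad) between the
# laserscanner frame and the camera frame; real values come from calibration.
t_lever = np.array([0.05, 0.00, -0.02])
roll, pitch, yaw = 0.01, -0.02, 0.005

# Hypothetical camera intrinsics (focal length and principal point in pixels),
# as obtained from the geometric camera calibration.
K = np.array([[1500.0,    0.0, 640.0],
              [   0.0, 1500.0, 480.0],
              [   0.0,    0.0,   1.0]])

def rotation_matrix(roll, pitch, yaw):
    """Bore-sight rotation from roll/pitch/yaw (x-y-z convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project_laser_point(p_laser):
    """Transform a 3D laser point into the camera frame and project it."""
    p_cam = rotation_matrix(roll, pitch, yaw) @ p_laser + t_lever
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]   # pixel coordinates (u, v)

uv = project_laser_point(np.array([1.0, 0.5, 10.0]))
print(uv)
```

The resulting pixel coordinates can then be matched against the interest points extracted from the image, so that laser-derived and purely image-derived observations enter the same tracking pipeline.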

Highlights

  • Nowadays unmanned aerial vehicles (UAVs) are promising platforms for capturing spatial information

  • Simultaneous Localization and Mapping (SLAM) has to be conducted as defined by Durrant-Whyte & Bailey (2006): ‘SLAM is a process by which a mobile robot can build a map of an environment and at the same time use this map to deduce its location’

  • For performing SLAM with optical sensors, a successful and precise localization of the unmanned aircraft system (UAS) and a simultaneous 3D mapping of the environment can be gained by sensing distinctive elements of the environment, referred to as landmarks


Summary

INTRODUCTION

Nowadays unmanned aerial vehicles (UAVs) are promising platforms for capturing spatial information. For performing SLAM with optical sensors, a successful and precise localization of the unmanned aircraft system (UAS) and a simultaneous 3D mapping of the environment can be achieved by sensing distinctive elements of the environment, referred to as landmarks. For these 3D landmarks, usually no prior knowledge about their location is given, and their 3D positions have to be estimated as accurately as possible by utilizing descriptive 2D image features from various observations.
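The estimation of a landmark's 3D position from several 2D observations can be illustrated with a standard linear (DLT) triangulation; this is a generic sketch, not the paper's bundle adjustment, and the camera poses and intrinsics below are hypothetical.

```python
import numpy as np

def triangulate(proj_matrices, image_points):
    """Linear (DLT) triangulation of one 3D landmark from >= 2 views.

    proj_matrices: list of 3x4 camera projection matrices P_i = K [R | t]
    image_points:  list of (u, v) observations of the same landmark
    """
    A = []
    for P, (u, v) in zip(proj_matrices, image_points):
        # Each observation contributes two homogeneous linear equations.
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    A = np.asarray(A)
    # Homogeneous least-squares solution: right singular vector of A
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative setup with hypothetical poses: two cameras looking along +Z,
# the second shifted by a 1 m baseline along X.
K = np.array([[1000.0, 0, 320.0], [0, 1000.0, 240.0], [0, 0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([0.5, -0.2, 8.0])
X_est = triangulate([P1, P2], [project(P1, X_true), project(P2, X_true)])
print(X_est)  # recovers X_true up to numerical precision
```

In a full SLAM pipeline such triangulated points would only serve as initial values; a bundle adjustment then refines landmark positions and camera poses jointly over all observations.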

METHODOLOGY
  • System calibration
  • Online processing
  • Digital camera - Canon Digital IXUS 100 IS
  • Laserscanner device - Hokuyo UTM-30LX-EW
SYSTEM CALIBRATION
  • Geometric camera calibration
  • Lever-arm and bore-sight calibration
ONLINE PROCESSING
Evaluation Results
