Abstract

Unmanned aerial vehicles (UAVs) have been widely used for 3D reconstruction/modelling in various applications such as precision agriculture, coastal monitoring, and emergency management. For such mapping applications, camera and LiDAR are the two most commonly used sensors. Mapping with imagery-based approaches is considered an economical and effective option and is often conducted using Structure from Motion (SfM) techniques, where point clouds and orthophotos are generated. In addition to UAV photogrammetry, point clouds of the area of interest can also be directly derived from LiDAR sensors onboard UAVs equipped with global navigation satellite systems/inertial navigation systems (GNSS/INS). In this study, a custom-built UAV-based mobile mapping system is used to simultaneously collect imagery and LiDAR data. The derived LiDAR and image-based point clouds are investigated and compared in terms of their absolute and relative accuracy. Furthermore, the stability of the system calibration parameters for the camera and LiDAR sensors is studied using temporal datasets. The results show that while LiDAR point clouds demonstrate a high absolute accuracy over time, image-based point clouds are not as accurate as LiDAR due to instability of the camera interior orientation parameters.

Highlights

  • Unmanned aerial vehicles (UAVs) equipped with global navigation satellite systems/inertial navigation systems (GNSS/INS) are becoming more popular for many applications because of their capability to carry advanced sensors and collect both high temporal and high spatial resolution data

  • Similar to the results shown by Elsner et al. (2018), the image-based point cloud showed a constant positive elevation bias of 4 to 9 cm when compared to the LiDAR surfaces and Real-Time Kinematic (RTK)-GNSS measurements

  • Large standard deviation (STD) values in the X and Y directions indicate a horizontal misalignment of the image-based point cloud


Introduction

Unmanned aerial vehicles (UAVs) equipped with global navigation satellite systems/inertial navigation systems (GNSS/INS) are becoming more popular for many applications because of their capability to carry advanced sensors and collect data with both high temporal and high spatial resolution. UAV-based systems can provide accurate 3D spatial information at a relatively low cost and facilitate various applications including precision agriculture (Moghimi et al., 2020; Ravi et al., 2019; Masjedi et al., 2018; He et al., 2018; Habib et al., 2016), infrastructure monitoring (Greenwood et al., 2019), and archaeological documentation (Lin et al., 2019; Hamilton, Stephenson, 2016). The reconstructed image-based 3D model can be georeferenced using either ground control points (GCPs), known as indirect georeferencing, or trajectory information provided by a survey-grade GNSS/INS unit onboard the UAV, known as direct georeferencing. System calibration of UAV-based GNSS/INS-assisted imaging and/or ranging systems is a vital step for direct georeferencing and for reconstructing accurate LiDAR/image-based point clouds. Any deviation of the system calibration parameters from their true values will adversely affect the accuracy of the reconstructed object space.
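The direct georeferencing described above follows the standard point-positioning model: a sensor-frame measurement is transformed to the mapping frame through the mounting parameters (lever arm and boresight rotation) and the GNSS/INS-derived platform position and attitude. The sketch below illustrates this chain of transformations; all numeric values (platform position, mounting parameters, LiDAR return) are hypothetical and chosen only for illustration, not taken from the paper's system.

```python
import numpy as np

def rot_z(yaw):
    """Rotation matrix about the z-axis by angle yaw (radians)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def direct_georeference(r_b_m, R_b_m, lever_arm, R_s_b, p_s):
    """Transform a sensor-frame point p_s to the mapping frame:
        r^m = r_b^m + R_b^m (a^b + R_s^b p^s)
    where r_b^m / R_b^m are the GNSS/INS position and attitude of the
    body frame, a^b is the lever arm, and R_s^b the boresight rotation."""
    return r_b_m + R_b_m @ (lever_arm + R_s_b @ p_s)

# Hypothetical example values (illustration only)
platform_pos = np.array([500000.0, 4500000.0, 120.0])  # body position in mapping frame (m)
platform_rot = rot_z(np.deg2rad(90.0))                 # body attitude (heading only, for simplicity)
lever_arm = np.array([0.10, 0.02, -0.05])              # sensor offset in body frame (m)
boresight = rot_z(np.deg2rad(0.5))                     # small residual mounting rotation
lidar_point = np.array([0.0, 0.0, -100.0])             # nadir LiDAR return in sensor frame (m)

ground = direct_georeference(platform_pos, platform_rot,
                             lever_arm, boresight, lidar_point)
```

Because every measurement passes through the lever arm and boresight terms, an error in either mounting parameter biases all derived ground coordinates, which is why the system calibration step is critical for direct georeferencing.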
