Abstract

Automatically registering 3D point clouds generated by unmanned aerial and ground vehicles (UAVs and UGVs) is challenging, as the data are acquired at different locations with different sensors, resulting in different spatial scales and occlusions. To address these problems, this study proposes a framework for the automated registration of UAV and UGV point clouds using 2D local feature points in the images taken from UAVs and UGVs. This study first conducted field experiments varying the angle of the UAV camera to identify the optimal angle for detecting a sufficient number of feature points matching the images taken by the UGV. As a result, this study identified that a combination of UAV images taken at 30° and 90° is appropriate for generating a sufficient number of matching points and attaining a reasonable level of precision. The UAV and UGV point clouds were initially scaled and registered with a transformation matrix computed from the 3D points corresponding to the 2D feature matching points. The initially aligned point clouds were subsequently adjusted by the Iterative Closest Point (ICP) algorithm, resulting in a root mean square error (RMSE) of 0.112 m. This promising result indicates that full automation of spatial data collection and registration in a cluttered environment (e.g., construction or disaster sites) by UAVs and UGVs is feasible without human intervention.
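The two-stage pipeline described in the abstract (a scaled initial alignment from 3D point correspondences, followed by ICP refinement) can be sketched as follows. This is an illustrative NumPy reconstruction, not the authors' implementation: the closed-form Umeyama solution stands in for their transformation-matrix computation from matched feature points, and a brute-force point-to-point ICP stands in for their refinement step. All function names and the synthetic data are assumptions for the sketch.

```python
import numpy as np

def umeyama_alignment(src, dst):
    """Estimate a similarity transform (scale s, rotation R, translation t)
    mapping paired 3D points src -> dst (Umeyama, 1991). Here it plays the
    role of the initial scaled registration from 3D correspondences."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance, 3x3
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                          # guard against reflections
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

def icp_refine(src, dst, iters=20):
    """Minimal point-to-point ICP: repeatedly match each source point to its
    nearest target point and solve the rigid update in closed form (Kabsch).
    Returns the refined cloud and the final nearest-neighbour RMSE."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbours (fine for a sketch; use a KD-tree
        # for real clouds)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = dst[d2.argmin(1)]
        mu_c, mu_n = cur.mean(0), nn.mean(0)
        H = (cur - mu_c).T @ (nn - mu_n)
        U, _, Vt = np.linalg.svd(H)
        if np.linalg.det(Vt.T @ U.T) < 0:
            Vt[2] *= -1
        R = Vt.T @ U.T
        t = mu_n - R @ mu_c
        cur = cur @ R.T + t
    d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    rmse = np.sqrt(d2.min(1).mean())
    return cur, rmse

if __name__ == "__main__":
    # Synthetic stand-in for matched UAV/UGV 3D points: the "UGV" cloud is a
    # scaled, rotated, translated copy of the "UAV" cloud.
    rng = np.random.default_rng(0)
    src = rng.random((50, 3))
    theta = 0.3
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
    dst = 2.0 * src @ R_true.T + np.array([1.0, 2.0, 3.0])
    s, R, t = umeyama_alignment(src, dst)
    aligned = s * src @ R.T + t               # initial registration
    refined, rmse = icp_refine(aligned, dst)  # ICP adjustment
    print(f"recovered scale: {s:.3f}, final RMSE: {rmse:.2e}")
```

Note the design choice mirrored from the paper: the similarity transform handles the scale difference between UAV and UGV clouds up front, so the subsequent ICP stage only needs to solve a rigid (scale-free) refinement.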


