Abstract

The last few years have witnessed an increasing volume of aerial image data owing to the extensive improvements in Unmanned Aerial Vehicles (UAVs). These newly developed UAVs have enabled a wide variety of applications. A fast assessment of the coverage and overlap achieved by the images acquired during a UAV flight mission helps to save the time and cost of subsequent processing steps, and fast automatic stitching of the acquired images supports a visual assessment of that coverage and overlap during the flight mission. This paper proposes an automatic image stitching approach that creates a single overview stitched image from the images acquired during a UAV flight mission, along with a coverage image that represents the number of overlaps between the acquired images. The main challenge of such a task is the huge number of images typically involved: a short flight mission with an acquisition frequency of one image per second can capture hundreds to thousands of images. The main focus of the proposed approach is to reduce the processing time of the image stitching procedure by exploiting the initial knowledge about the image positions provided by the navigation sensors. The proposed approach also avoids solving for the transformation parameters of all the images simultaneously, which would otherwise require a long computation time. After extracting the points of interest of all the involved images using the Scale-Invariant Feature Transform (SIFT) algorithm, the proposed approach uses the initial image coordinates to build an incremental constrained Delaunay triangulation that represents the neighborhood of each image. This triangulation makes it possible to match only neighboring images and therefore reduces the time-consuming feature matching step. The estimated relative orientation between the matched images is used to find a candidate seed image for the stitching process. The pre-estimated transformation parameters of the images are then applied successively, in a growing fashion, to create the stitched image and the coverage image. The proposed approach is implemented and tested using images acquired during a UAV flight mission, and the achieved results are presented and discussed.
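
The paper does not publish source code; the following is a minimal sketch of the neighborhood idea described above, assuming approximate planimetric image coordinates from the navigation sensors are available. It uses scipy's (unconstrained) Delaunay triangulation, whereas the proposed approach builds an incremental constrained one; all function and variable names are illustrative.

    import itertools
    import numpy as np
    from scipy.spatial import Delaunay

    def neighbour_pairs(positions):
        """Return index pairs of images that share a triangulation edge.

        positions : (N, 2) array of approximate planimetric image coordinates
                    taken from the navigation sensors (illustrative input).
        """
        tri = Delaunay(np.asarray(positions, dtype=float))
        pairs = set()
        for simplex in tri.simplices:               # each simplex is a triangle (i, j, k)
            for i, j in itertools.combinations(simplex, 2):
                pairs.add((min(i, j), max(i, j)))   # store each edge only once
        return sorted(pairs)

    # Only these pairs would be passed to the feature matching step,
    # instead of all N*(N-1)/2 possible image combinations.

Restricting matching to triangulation edges keeps the number of candidate pairs roughly linear in the number of images, which is where the reduction of the feature matching time described above comes from.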

Highlights

  • Aerial images have played an essential role in a wide variety of applications such as environmental monitoring, urban planning and mapping, and disaster assessment

  • Different image features have been successfully utilized in image mosaicing, such as the Scale-Invariant Feature Transform (SIFT) (Brown and Lowe, 2007; Jia et al., 2015; Liqian and Yuehui, 2010), Harris points (Zagrouba et al., 2009), and Speeded Up Robust Features (SURF) (Geng et al., 2012; Rong et al., 2009; Wang and Watada, 2015; Xingteng et al., 2015); see the OpenCV-based sketch after this list

  • This paper proposes a fast, suboptimal stitching approach that constructs a single overview stitched image and a coverage map from aerial images
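
A minimal sketch (not the authors' implementation) of how SIFT features could be extracted and matched for one neighboring image pair is given below, using OpenCV (opencv-python 4.4 or later, where SIFT is exposed as cv2.SIFT_create); the Lowe ratio threshold and RANSAC tolerance are illustrative values.

    import cv2
    import numpy as np

    def match_pair(img_a, img_b, ratio=0.75):
        """Match SIFT features between two grayscale images and return an
        estimated homography, or None if there are too few good matches."""
        sift = cv2.SIFT_create()
        kp_a, des_a = sift.detectAndCompute(img_a, None)
        kp_b, des_b = sift.detectAndCompute(img_b, None)

        matcher = cv2.BFMatcher(cv2.NORM_L2)
        knn = matcher.knnMatch(des_a, des_b, k=2)
        # Lowe ratio test: keep a match only if it is clearly better than the runner-up
        good = [p[0] for p in knn if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) < 4:
            return None                             # not enough matches for a homography

        src = np.float32([kp_a[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_b[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H                                    # maps points of img_a into img_b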

Summary

Introduction

Aerial images have played an essential role in a wide variety of applications such as environmental monitoring, urban planning and mapping, and disaster assessment. Assessing the coverage and overlap achieved by the images acquired during a flight mission is of great help to ensure successful product generation, support correct decisions, and save the time and cost of subsequent steps. This assessment is crucial to mitigate problems caused by unexpected imaging failures and by the large angular variations expected between images at low flight altitudes (Zhang et al., 2011). High camera rotations between adjacent images, due to wind effects or inaccuracies of the stabilising platform, can decrease the achieved overlap (Zhang et al., 2011), miss important information, and deteriorate the accuracy of the final product.
