Abstract

With respect to the usual processing chain in UAV photogrammetry, the camera's influence on the achievable accuracy level is of high interest. In most applications consumer cameras are used due to their light weight. They usually allow only for automatic zoom or restricted manual modes, and the stability and long-term validity of their interior orientation parameters are open to question. Additionally, common aerial flights do not provide adequate images for self-calibration. Nonetheless, processing software includes self-calibration based on EXIF information as a standard setting. The subsequent impact of the interior orientation parameters on the reconstruction in object space cannot be neglected. With respect to these key issues, different investigations on the quality of interior orientation and its impact in object space are addressed. On the one hand, the investigations concentrate on the improvement in accuracy obtained by applying pre-calibrated interior orientation parameters. On the other hand, image configurations are investigated that allow for an adequate self-calibration in UAV photogrammetry. The analyses of the interior orientation focus on the estimation quality of the interior orientation parameters using volumetric test scenarios as well as planar patterns as they are commonly used in computer vision. This is done with an Olympus Pen E-PM2 and a Canon G1X as representative system cameras. For the analysis of image configurations a simulation-based approach is applied. The analyses include investigations of varying principal distance and principal point to evaluate the system's stability.
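A planar-pattern calibration in the computer-vision style delivers intrinsics in pixel units, while the photogrammetric interior orientation is usually expressed as a principal distance and principal-point offsets in millimetres. A minimal sketch of this conversion is shown below; the pixel pitch, sensor dimensions, and intrinsic values are illustrative assumptions, not the parameters of the tested cameras.

```python
def cv_to_photogrammetric(fx, cx, cy, width, height, pixel_pitch_mm):
    """Convert OpenCV-style intrinsics (pixels, origin at the top-left
    corner, y pointing down) to photogrammetric interior orientation
    (millimetres, origin at the image centre, y pointing up)."""
    c = fx * pixel_pitch_mm                    # principal distance
    x0 = (cx - width / 2.0) * pixel_pitch_mm   # principal point offset x
    y0 = (height / 2.0 - cy) * pixel_pitch_mm  # offset y, sign flipped
    return c, x0, y0

# Illustrative values: a 4608 x 3456 sensor with 3.75 um pixels
c, x0, y0 = cv_to_photogrammetric(fx=4270.0, cx=2310.0, cy=1720.0,
                                  width=4608, height=3456,
                                  pixel_pitch_mm=0.00375)
```

Note the sign flip in the y-coordinate, which accounts for the opposite image-axis conventions of the two communities.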

Highlights

  • In UAV photogrammetry the interior orientation of the camera system, its stability during image acquisition and flight, and its calibration options and consideration in the bundle adjustment are limiting factors for the accuracy level of the processing chain

  • For all signalized points:
    a. their coordinates in object space, estimated within a bundle adjustment, are transformed to their control point coordinates;
    b. their coordinates in object space, estimated within a bundle adjustment including pre-calibrated interior orientation parameters, are transformed to their control point coordinates;
    c. forward intersections for object coordinates are calculated from previously estimated interior and exterior orientation parameters using the overlapping images, and transformed to their control point coordinates;
    d. forward intersections for object coordinates are calculated from previously estimated exterior orientation parameters using the overlapping images together with pre-calibrated interior orientation parameters, and transformed to their control point coordinates

  • The results show an increase in the object space accuracy in X- and Y-direction by introducing yaw-changed images in contrast to a standard data set
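The forward intersection used in variants c and d above can be sketched as a linear two-ray triangulation, assuming the interior and exterior orientation are already combined into 3×4 projection matrices; the camera matrix, baseline, and image observations below are illustrative values, not data from the study.

```python
import numpy as np

def forward_intersection(P1, P2, x1, x2):
    """Linear forward intersection of one object point from two images.
    P1, P2: 3x4 projection matrices (interior x exterior orientation).
    x1, x2: (u, v) image observations of the same point."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # least-squares solution of A X = 0
    X = Vt[-1]
    return X[:3] / X[3]           # de-homogenize

# Illustrative stereo pair: identical cameras, 1 m baseline along X
K = np.array([[1000.0,    0.0, 500.0],
              [   0.0, 1000.0, 500.0],
              [   0.0,    0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X = forward_intersection(P1, P2, (540.0, 520.0), (340.0, 520.0))
```

The same routine serves both variants: only the source of the interior orientation folded into P1 and P2 (self-calibrated versus pre-calibrated) differs.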


Summary

INTRODUCTION

In UAV photogrammetry the interior orientation of the camera system, its stability during image acquisition and flight, and its calibration options and consideration in the bundle adjustment are limiting factors for the accuracy level of the processing chain. Besides the use of accuracy-limiting hardware components, the application of different software packages for UAV photogrammetry might significantly influence the processing results. In most cases the application of professional photogrammetric processing software to UAV imagery is limited: it would require standard image blocks, since the determination of overlapping areas for automatic processing assumes the regular geometry usually found in aerial photogrammetry products. Simulation-based scenarios for image block configurations in field calibrations are published by Kruck & Mélykuti (2014). They note that the simulation focuses on the determinability, but not on the reliability, of parameter estimation. Here, image configurations are investigated that allow for an adequate self-calibration in UAV photogrammetry.
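A simulation-based study of image configurations can start from a generator for the block geometry. The following is a minimal sketch that produces projection centres for a nadir block and optionally interleaves yaw-rotated exposures of the kind the highlights refer to; all parameters are hypothetical, not those of the reported simulation.

```python
import numpy as np

def simulate_block(n_strips, n_per_strip, spacing, height, add_yaw=False):
    """Generate (X, Y, Z, yaw) stations for a simulated nadir image block.
    With add_yaw=True every station is duplicated with a 90 degree yaw
    rotation, a configuration often used to strengthen self-calibration."""
    stations = []
    for i in range(n_strips):
        for j in range(n_per_strip):
            x, y = j * spacing, i * spacing
            stations.append((x, y, height, 0.0))
            if add_yaw:
                stations.append((x, y, height, 90.0))
    return np.array(stations)

# Hypothetical block: 4 strips of 6 images, 20 m spacing, 50 m height
standard = simulate_block(4, 6, 20.0, 50.0)
yawed = simulate_block(4, 6, 20.0, 50.0, add_yaw=True)
```

Projecting a synthetic test field through such station lists, and adjusting the resulting image observations, allows the determinability of the interior orientation parameters to be compared between configurations.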

Mathematical models of interior orientation
Conversion of interior orientation parameters
Tested cameras
Test fields and scenarios
Parameter results
Impact in object space
FLIGHT SIMULATION
Findings
SUMMARY