Abstract

Images acquired by airborne sensors exhibit blur due to turbulent motion experienced by the aircraft. A significant amount of blur renders the images unusable for subsequent visual/automated analysis, requiring a re-flight. This necessitates quantifying the amount of blur in an image. Most approaches to blur quantification use image-based methods to estimate the MTF (Modulation Transfer Function), which indicates the amount of blur. Their limitations are: (1) MTF calculation requires the presence of straight edges in the image scene, which may not always be available; (2) due to the absence of an ideal edge, the amount of blur estimated is only relative; and (3) it is computationally expensive and therefore may not be practical for blur detection in real-time applications. Our solution uses the sensor motion as measured by an Inertial Measurement Unit (IMU) mounted in the camera system to calculate the motion experienced by the aircraft and the sensor during the time the shutter was actually open. This motion information, together with the blur detection algorithm presented in this paper, can provide an accurate quantification of blur in pixel units. Once we identify the images that exceed a given blur threshold, we use a blur removal algorithm that uses the IMU data and a natural image prior to compute per-pixel, spatially varying blur, and then deconvolves an image to produce a deblurred image. The presented blur detection and removal methods are currently being used offline to quantify and remove the blur from images acquired by the UltraCam systems within the Global Ortho Program (Walcher, 2012), which generates ortho imagery for all of the continental US as well as Western Europe from over 2.5 million images. Furthermore, the blur detection method will be incorporated into the camera software of all our operational and forthcoming UltraCam imaging systems for real-time blur quantification.
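The core idea — converting sensor motion recorded while the shutter is open into a blur extent in pixel units — can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes roughly constant IMU angular rates over the exposure and a small-angle approximation near the image center, and all names and parameters are illustrative.

```python
import math

def blur_extent_px(omega_x, omega_y, exposure_s, focal_length_px):
    """Approximate motion blur, in pixels, from camera angular rates.

    Assumes the angular rates (rad/s, as reported by the IMU) stay
    roughly constant while the shutter is open, and uses the
    small-angle approximation so an angular displacement d_theta maps
    to about d_theta * focal_length_px pixels on the sensor.
    Illustrative only; not the method from the paper.
    """
    dtheta_x = omega_x * exposure_s   # angular displacement about x (rad)
    dtheta_y = omega_y * exposure_s   # angular displacement about y (rad)
    # Image-space displacement of a point near the image center (px)
    dx = dtheta_y * focal_length_px
    dy = dtheta_x * focal_length_px
    return math.hypot(dx, dy)

# Example: 0.5 deg/s pitch rate, 1/250 s exposure, focal length of 10000 px
blur = blur_extent_px(math.radians(0.5), 0.0, 1 / 250, 10000)
```

An image would then be flagged for deblurring (or a re-flight) when this per-image value exceeds the chosen pixel threshold.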

Highlights

  • Many services and applications provide functionality associated with images captured by cameras

  • The proposed metric for automatic blur detection has proven to be a crucial part of our automated quality control, and has been instrumental in defining image quality specifications with regard to image blur for our customers. Both are paramount when dealing with large photogrammetric ortho projects such as the Global Ortho Program (Walcher, 2012)

  • The proposed algorithm for blur detection will be implemented in the UltraCam sensor family



Introduction

Many services and applications provide functionality associated with images captured by cameras. An online mapping service may provide users with interactive maps derived from images captured by cameras mounted on aircraft, vehicles, and/or satellites. A web search engine may provide users with search results comprising one or more images captured by cameras. Many images acquired by cameras may suffer from blur. Blur may occur due to motion of a camera while an image is captured (e.g., a camera mounted to an aircraft may experience three-dimensional movement due to turbulent motion experienced by the aircraft during image capture). Substantial blur within an image may render the image unusable for certain applications/functions.
