Abstract

Pan-sharpening (PS) is a method of fusing the spatial details of a high-resolution panchromatic (PAN) image with the spectral information of a low-resolution multi-spectral (MS) image. Visual inspection is a crucial step in the evaluation of fused products, but its subjectivity makes the assessment of pan-sharpened data a challenging problem. Most previous research on the development of PS algorithms has only superficially addressed qualitative evaluation, generally by depicting visual representations of the fused images. Hence, it is highly desirable to be able to predict pan-sharpened image quality automatically and accurately, as it would be perceived and reported by human viewers. Such a method is indispensable for the correct evaluation of PS techniques that produce images for visual applications such as Google Earth and Microsoft Bing. Here, we propose a new image quality assessment (IQA) measure that supports the visual qualitative analysis of pan-sharpened outcomes by using the statistics of natural images, commonly referred to as natural scene statistics (NSS), to extract statistical regularities from PS images. Importantly, NSS are measurably modified by the presence of distortions. We analyze six PS methods in the presence of two common distortions, blur and white noise, applied to the PAN images. Furthermore, we conducted a human study on the subjective quality of pristine and degraded PS images and created a completely blind (opinion-unaware) fused image quality analyzer. In addition, we propose an opinion-aware fused image quality analyzer whose predictions correlate highly with human perceptual evaluations of pan-sharpened images.
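As a rough illustration of the kind of NSS feature such analyzers typically rely on (the exact feature set of the proposed measure is not detailed in this summary), the sketch below computes mean-subtracted contrast-normalized (MSCN) coefficients, a standard NSS representation whose empirical distribution changes measurably under blur or added noise. The function name, window size, and parameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn_coefficients(image, sigma=7 / 6, eps=1e-8):
    """Mean-subtracted contrast-normalized (MSCN) coefficients.

    A standard NSS representation: for pristine natural images the MSCN
    histogram is close to Gaussian, and distortions such as blur or
    white noise measurably change its shape.
    """
    img = image.astype(np.float64)
    mu = gaussian_filter(img, sigma)                       # local mean
    var = gaussian_filter(img * img, sigma) - mu * mu      # local variance
    sigma_map = np.sqrt(np.abs(var))                       # local contrast
    return (img - mu) / (sigma_map + eps)

# Example: compare the spread of MSCN coefficients for a synthetic
# (hypothetical) PAN image and a blurred version of it.
rng = np.random.default_rng(0)
pan = rng.random((256, 256))
blurred = gaussian_filter(pan, 2.0)                        # simulated blur distortion
print(np.var(mscn_coefficients(pan)), np.var(mscn_coefficients(blurred)))
```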

Highlights

  • Pan-sharpening (PS) is a conventional approach for integrating the spatial details of a high-resolution panchromatic (PAN) image and the spectral information of a low-resolution multi-spectral (MS) image to produce a high-resolution MS image [1]

  • We calculated the geometric mean of the resulting differential mean opinion scores (DMOS) (DMOS_GM = √(DMOS_TC · DMOS_PC)) obtained by evaluating the true- and pseudo-color versions of the PS images in order to generate one score to be mapped by the SVR (a minimal sketch of this step follows the list)

  • To rank the PS techniques, we used the number of times a PS technique was placed in the top three ranks according to a given set of metrics (i.e., reduced resolution (RRes) measures, PS image quality assessment (IQA) analyzers, or DMOS) to determine its classification as low, medium, or high performance
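The DMOS combination and SVR mapping referred to above can be sketched as follows. This is a hedged illustration only: the feature vectors, score ranges, and SVR hyperparameters are hypothetical placeholders; only the geometric-mean combination and the general regression step follow the description in the highlights.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical subjective scores for N pan-sharpened images: one DMOS from
# the true-color (TC) session and one from the pseudo-color (PC) session.
rng = np.random.default_rng(1)
n_images = 50
dmos_tc = rng.uniform(20, 80, n_images)
dmos_pc = rng.uniform(20, 80, n_images)

# Geometric mean of the two DMOS values gives a single target per image.
dmos_gm = np.sqrt(dmos_tc * dmos_pc)

# Hypothetical NSS feature vectors extracted from each fused image.
features = rng.normal(size=(n_images, 18))

# Opinion-aware analyzer: regress the features onto DMOS_GM with an SVR.
model = SVR(kernel="rbf", C=10.0, gamma="scale")
model.fit(features, dmos_gm)
predicted_quality = model.predict(features)
```

In practice the regressor would be evaluated on held-out images by correlating its predictions with DMOS, in line with the correlation-based agreement with human judgments that the abstract reports.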


Summary

Introduction

Pan-sharpening (PS) is a conventional approach for integrating the spatial details of a high-resolution panchromatic (PAN) image and the spectral information of a low-resolution multi-spectral (MS) image (both simultaneously obtained over the same region) to produce a high-resolution MS image [1]. The fused images obtained are known to be spatially and spectrally enhanced compared to the MS and the PAN images, respectively. Component substitution (CS) approaches replace the spatial information contained within an original MS image with the spatial details contained in a PAN image. This substitution can yield visually appealing PS images that are robust against small misregistration errors. Multiresolution analysis (MRA) methods extract PAN details via spatial filtering while preserving spectral information, yielding outcomes that are robust with respect to temporal misalignments [5]. Modern approaches have recently been developed that advance the performance of these classical methods; they reformulate PS as an inverse problem, in which the goal is to recover a high-resolution MS image from low-resolution MS and PAN measurements. Any of these algorithms may introduce spatial and spectral distortions that can adversely affect the quality of pan-sharpened images.
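As a minimal sketch of the component-substitution idea described above (and not any specific algorithm evaluated in the paper), the following fuses upsampled MS bands with PAN detail by injecting the difference between the PAN image and an intensity component built from the MS bands. The averaging-based intensity, the unit injection gains, and the bicubic upsampling are assumptions.

```python
import numpy as np
from scipy.ndimage import zoom

def cs_pansharpen(ms, pan, gains=None):
    """Generic component-substitution (CS) pan-sharpening sketch.

    ms  : (bands, h, w) low-resolution multispectral image
    pan : (H, W) high-resolution panchromatic image, with H/h == W/w
    """
    scale = pan.shape[0] / ms.shape[1]
    # Upsample each MS band to the PAN grid (bicubic interpolation assumed).
    ms_up = np.stack([zoom(band, scale, order=3) for band in ms])

    # Intensity component: here simply the average of the upsampled bands.
    intensity = ms_up.mean(axis=0)

    # Inject the PAN spatial detail into every band (unit gains assumed).
    if gains is None:
        gains = np.ones(ms.shape[0])
    detail = pan - intensity
    return ms_up + gains[:, None, None] * detail

# Example with synthetic data: a 4-band MS image at 64x64 and a PAN image at 256x256.
ms = np.random.rand(4, 64, 64)
pan = np.random.rand(256, 256)
fused = cs_pansharpen(ms, pan)   # (4, 256, 256) pan-sharpened result
```

An MRA-style method would instead obtain the detail by spatially filtering the PAN image (e.g., subtracting a low-pass version of it), which is why, as noted above, that family tends to preserve spectral information better.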

