Abstract
Being able to automatically predict digital picture quality, as perceived by human observers, has become important in many applications where humans are the ultimate consumers of displayed visual information. Standard dynamic range (SDR) images provide 8 bits/color/pixel. High dynamic range (HDR) images, which are usually created from multiple exposures of the same scene, can provide 16 or 32 bits/color/pixel, but must be tonemapped to SDR for display on standard monitors. Multi-exposure fusion (MEF) techniques bypass HDR creation by fusing the exposure stack directly to SDR format while aiming for aesthetically pleasing luminance and color distributions. Here we describe a new no-reference image quality assessment (NR IQA) model for HDR pictures that is based on standard measurements of the bandpass and on newly conceived differential natural scene statistics (NSS) of HDR pictures. We derive an algorithm from the model, which we call the Gradient Image Quality Assessment algorithm (G-IQA). NSS models have previously been used to devise NR IQA models that effectively predict the subjective quality of SDR images, but they perform significantly worse on tonemapped HDR content. Towards ameliorating this, we make the following contributions: (1) we design an HDR picture NR IQA model and algorithm using both standard space-domain NSS features and novel HDR-specific gradient-based features that significantly elevate prediction performance; (2) we validate the proposed model on a large-scale crowdsourced HDR image database; and (3) we demonstrate that the proposed model also performs well on legacy natural SDR images. The software is available at: http://signal.ece.utexas.edu/%7Edebarati/higradeRelease.zip.
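To make the feature pipeline concrete, the sketch below illustrates the general flavor of such an approach: it computes mean-subtracted, contrast-normalized (MSCN) bandpass coefficients and gradient-magnitude maps from a tone-mapped image and summarizes them with simple statistics. The filter scale, the chosen summary statistics, and the idea of feeding them to a learned regressor are illustrative assumptions, not the published implementation or its exact feature set.

```python
# A minimal sketch (not the authors' exact feature set) of extracting
# space-domain NSS and gradient-based statistics from a tone-mapped image.
# Assumes a grayscale image with values in [0, 1]; sigma and the summary
# statistics below are illustrative choices.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import kurtosis, skew

def mscn_coefficients(img, sigma=7 / 6, eps=1e-8):
    """Mean-subtracted, contrast-normalized (MSCN) bandpass coefficients."""
    mu = gaussian_filter(img, sigma)
    var = gaussian_filter(img * img, sigma) - mu * mu
    return (img - mu) / (np.sqrt(np.maximum(var, 0.0)) + eps)

def gradient_magnitude(img):
    """Gradient magnitude from central differences along rows and columns."""
    gy, gx = np.gradient(img)
    return np.hypot(gx, gy)

def nss_summary(field):
    """Illustrative summary statistics of a bandpass or gradient field."""
    flat = field.ravel()
    return np.array([flat.mean(), flat.std(), skew(flat), kurtosis(flat)])

def extract_features(img):
    # Concatenate space-domain NSS statistics with gradient-based ones;
    # in practice a trained regressor (e.g., an SVR) would map such a
    # feature vector to a predicted quality score.
    mscn = mscn_coefficients(img)
    grad = gradient_magnitude(img)
    return np.concatenate([
        nss_summary(mscn),
        nss_summary(gradient_magnitude(mscn)),
        nss_summary(mscn_coefficients(grad)),
    ])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.random((128, 128))  # stand-in for a tone-mapped luminance image
    print(extract_features(demo))
```

The design intuition this sketch reflects is that tone-mapping and fusion artifacts disturb the statistical regularities of both bandpass responses and local gradients, so statistics of these fields can serve as quality-aware features without a reference image.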