Abstract

Objective Image Quality Assessment (IQA) methods often lack linearity between their quality estimates and the scores expressed by human subjects, so IQA metrics typically undergo a calibration process based on subjective quality examples. However, such example-based training poses a generalization problem that hampers the comparison of results across different applications and operating conditions. In this paper, new Full Reference (FR) techniques providing estimates linearly correlated with human scores, without any calibration, are introduced. We show that, on natural images, the application of estimation theory and psychophysical principles to images degraded by Gaussian blur leads to a so-called canonical IQA method, whose estimates are linearly correlated with both the subjective scores and the viewing distance. We then show that any mainstream IQA method can be mapped onto the canonical method by converting its metric on the basis of a single specimen image. The proposed scheme is extended to wide classes of degraded images, e.g. noisy and compressed images. The resulting calibration-free FR IQA methods allow for comparability and interoperability across different imaging systems and viewing distances. A comparison of their statistical performance with state-of-the-art calibration-dependent methods is finally provided, showing that the presented model is a valid alternative to the final 5-parameter calibration step of IQA methods, and that the two parameters of the model have a clear operational meaning and are easily determined in practical applications. The enhanced performance is achieved across multiple viewing-distance databases by independently realigning the blur values associated with each distance.
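
For context, the calibration step that the proposed model replaces is usually a 5-parameter logistic regression mapping raw metric values to subjective scores, fitted per database. Below is a minimal sketch of that conventional step, assuming the VQEG-style logistic form commonly used in IQA benchmarking; the function names, parameter values, and data are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy.optimize import curve_fit

def logistic5(q, b1, b2, b3, b4, b5):
    # Map raw objective scores q to predicted subjective scores (common VQEG-style form).
    return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (q - b3)))) + b4 * q + b5

# Hypothetical raw metric values and corresponding mean opinion scores (MOS).
q = np.array([0.62, 0.71, 0.78, 0.85, 0.90, 0.94, 0.97])
mos = np.array([2.1, 2.8, 3.2, 3.9, 4.2, 4.5, 4.8])

# Fit the five parameters on subjective examples: this is the database-specific
# calibration that hampers comparison across applications and viewing conditions.
p0 = [np.max(mos), 1.0, float(np.mean(q)), 1.0, float(np.mean(mos))]
params, _ = curve_fit(logistic5, q, mos, p0=p0, maxfev=10000)

mos_pred = logistic5(q, *params)
print("Pearson correlation after calibration:", np.corrcoef(mos_pred, mos)[0, 1])

The calibration-free approach described in the abstract avoids fitting these five parameters on subjective data, relying instead on a two-parameter model with a direct operational interpretation.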
