Abstract

Most well-known blind image quality assessment (BIQA) models follow a two-stage framework in which various types of features are first extracted and then used as input to a regressor. The regression algorithm models human perceptual measures from a training set of distorted images. However, this approach requires an intensive training phase to optimise the regression parameters. In this paper, we overcome this limitation by proposing an alternative BIQA model that predicts image quality using nearest neighbour methods, which have virtually zero training cost. The model, termed PATCH based blind Image Quality assessment (PATCH-IQ), has a learning framework that operates at the patch level. This enables PATCH-IQ to provide not only a global image quality estimate but also local image quality estimates. Based on the assumption that the perceived quality of a distorted image is best predicted by features drawn from images affected by the same distortion class, PATCH-IQ also introduces a distortion identification stage into its framework. This enables PATCH-IQ to identify the distortion affecting the image, a property that can be useful for further local processing stages. PATCH-IQ is evaluated on standard IQA databases, and its predicted scores correlate highly with human perception of image quality. It also delivers prediction accuracy and computational performance competitive with other state-of-the-art BIQA models.
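To make the pipeline described above concrete, the sketch below illustrates patch-level nearest-neighbour quality prediction with a distortion estimate obtained by majority vote over the neighbours. It is a minimal illustration, not PATCH-IQ's actual design: the feature choice (simple statistics of locally normalised patch coefficients), the patch size, the value of k, and all function names are assumptions, and in the paper distortion identification is a separate stage that precedes the quality estimation rather than being folded into the neighbour vote.

```python
# Illustrative sketch of a patch-level nearest-neighbour BIQA pipeline.
# Feature choice, k, patch size, and all names are assumptions for illustration only.
import numpy as np

def extract_patch_features(image, patch_size=32):
    """Split a grayscale image into non-overlapping patches and compute a
    simple statistical feature vector per patch (assumed features, not the paper's)."""
    h, w = image.shape
    feats = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            p = image[y:y + patch_size, x:x + patch_size].astype(np.float64)
            mu, sigma = p.mean(), p.std() + 1e-8
            z = (p - mu) / sigma                      # locally normalised coefficients
            feats.append([mu, sigma, np.mean(z ** 3), np.mean(np.abs(z))])
    return np.asarray(feats)                          # shape: (n_patches, 4)

def knn_predict(query_feats, train_feats, train_scores, train_labels, k=5):
    """For each query patch, find the k nearest training patches (Euclidean distance),
    average their subjective quality scores for a local quality estimate, and take the
    majority distortion label of the neighbours as a per-patch distortion estimate."""
    patch_scores, patch_distortions = [], []
    for q in query_feats:
        d = np.linalg.norm(train_feats - q, axis=1)
        idx = np.argsort(d)[:k]
        patch_scores.append(train_scores[idx].mean())
        vals, counts = np.unique(train_labels[idx], return_counts=True)
        patch_distortions.append(vals[np.argmax(counts)])
    patch_scores = np.asarray(patch_scores)
    # Global quality = mean of the local (patch) estimates;
    # global distortion class = majority vote over patches.
    vals, counts = np.unique(patch_distortions, return_counts=True)
    return patch_scores.mean(), patch_scores, vals[np.argmax(counts)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "training set": patch features with subjective scores and distortion labels.
    train_feats = rng.normal(size=(500, 4))
    train_scores = rng.uniform(0, 100, size=500)      # e.g. DMOS-like values
    train_labels = rng.integers(0, 3, size=500)       # e.g. JPEG / blur / noise classes
    test_image = rng.uniform(0, 255, size=(256, 256))
    q_feats = extract_patch_features(test_image)
    global_q, local_q, distortion = knn_predict(q_feats, train_feats,
                                                train_scores, train_labels)
    print(f"global quality: {global_q:.2f}, distortion class: {distortion}")
```

Because the only "training" is storing the labelled patch features, the approach has the virtually zero training cost highlighted in the abstract; the local scores in `local_q` correspond to the patch-level quality map that a PATCH-IQ-style model can expose alongside the global estimate.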
