Abstract
As digital images have become a primary medium across a broad range of applications, there is growing interest in the development of automatic, objective image quality assessment (IQA) methods. In this paper, a novel no-reference IQA (NRIQA) algorithm is proposed based on independent component analysis and a convolutional neural network. The proposed NRIQA algorithm consists of three steps: selection of representative patches, extraction of features from the selected patches, and prediction of the image quality from these features. First, an image is divided into non-overlapping patches, and the patches best suited for assessing the overall image quality are selected; we refer to the selected patches as image quality patches. The infinity norm of the gradient of each patch is used as the selection criterion, with the patches having the largest values being retained. Second, independent component analysis (ICA) is employed to extract the features of the image quality patches. Finally, a convolutional neural network (CNN) is applied to the independent component coefficients of the image quality patches to predict the corresponding differential mean opinion score (DMOS). We compared the performance of the proposed NRIQM with other IQMs in terms of PCC, SROCC, and RMSE on the LIVE2, CSIQ, and TID2008/2013 databases. The PCC, SROCC, and RMSE values reach 0.996, 0.999, and 6.011, respectively, on the TID2013 database. The comparison results show that the proposed NRIQM is superior to commonly used IQMs.
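To make the three-step pipeline concrete, the following sketch illustrates it in Python under several assumptions that are not specified in the abstract: 8×8 patches, 256 retained patches, 64 ICA components, a scikit-learn FastICA implementation, and a toy PyTorch CNN layout. It is a minimal illustration of the described approach, not the configuration or network used in the paper.

```python
# Minimal sketch of the pipeline described above: (1) select image quality
# patches by the infinity norm of their gradient, (2) extract ICA coefficients,
# (3) regress a DMOS value with a small CNN. All sizes are illustrative.
import numpy as np
from sklearn.decomposition import FastICA
import torch
import torch.nn as nn

def select_quality_patches(img, patch=8, n_keep=256):
    """Split a grayscale image into non-overlapping patches and keep those
    whose gradient has the largest infinity norm."""
    h, w = img.shape
    patches, scores = [], []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            p = img[y:y + patch, x:x + patch].astype(np.float64)
            gy, gx = np.gradient(p)
            scores.append(max(np.abs(gx).max(), np.abs(gy).max()))  # ||grad p||_inf
            patches.append(p.ravel())
    order = np.argsort(scores)[::-1][:n_keep]
    return np.stack([patches[i] for i in order])  # shape: (n_keep, patch * patch)

def ica_features(patches, n_components=64):
    """Project the selected patches onto independent components; the resulting
    coefficients are the features fed to the CNN."""
    ica = FastICA(n_components=n_components, max_iter=1000)
    return ica.fit_transform(patches)  # shape: (n_keep, n_components)

class QualityCNN(nn.Module):
    """Toy CNN mapping one image's ICA coefficient map to a predicted DMOS."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)), nn.Flatten(),
            nn.Linear(16 * 4 * 4, 1),
        )

    def forward(self, coeff_map):  # coeff_map: (batch, 1, n_keep, n_components)
        return self.net(coeff_map).squeeze(-1)  # predicted DMOS per image
```

As a usage sketch, one would call `select_quality_patches` and `ica_features` on each image, stack the coefficient matrices into a `(batch, 1, n_keep, n_components)` tensor, and train `QualityCNN` with a regression loss (e.g. MSE) against the DMOS labels from the database.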