Abstract

Perceived visual quality is closely tied to the mechanisms of the human brain and visual system. Recently, the free-energy principle, which has been widely studied in brain theory and neuroscience, has been introduced to quantify perception, action, and learning in the human brain. In the field of image quality assessment (IQA), on the one hand, the free-energy principle relies on an internal generative model to explain and predict the visual stimuli received by human observers. On the other hand, abundant psychological and neurobiological studies reveal that different frequency and orientation components of a visual stimulus excite different neurons in the striate cortex, the area of the cerebral cortex that processes visual information. Motivated by these two aspects, a novel reduced-reference IQA metric, the multi-channel free-energy based reduced-reference quality metric, is proposed in this paper. First, a two-level discrete Haar wavelet transform is used to decompose the input reference and distorted images. Next, to simulate the generative model of the human brain, sparse representation is leveraged to extract free-energy-based features from the subband images. Finally, the overall quality score is obtained through a support vector regressor. Extensive experimental comparisons on four benchmark image quality databases (LIVE, CSIQ, TID2008, and TID2013) demonstrate that the proposed method is highly competitive with representative reduced-reference and classical full-reference models.
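
To make the three-stage pipeline concrete, the following is a minimal sketch, not the authors' implementation: PyWavelets supplies the two-level Haar decomposition, an overcomplete DCT dictionary stands in for a learned dictionary, the sparse-approximation residual energy of each subband is used as a stand-in free-energy (prediction-error) feature, and scikit-learn's SVR maps the features to a quality score. All function names, the dictionary construction, and the parameter choices are illustrative assumptions.

```python
# Minimal sketch of the described pipeline (illustrative only, not the authors' code).
import numpy as np
import pywt                                    # PyWavelets: discrete Haar wavelet transform
from sklearn.decomposition import SparseCoder  # sparse representation via OMP
from sklearn.svm import SVR                    # support vector regressor for the final score


def haar_subbands(image, levels=2):
    """Two-level Haar decomposition; returns the approximation plus all detail subbands."""
    coeffs = pywt.wavedec2(image, wavelet="haar", level=levels)
    subbands = [coeffs[0]]
    for lh, hl, hh in coeffs[1:]:
        subbands.extend([lh, hl, hh])
    return subbands


def dct_dictionary(patch_size=8, atoms=128):
    """Overcomplete cosine dictionary for vectorized patches (a generic stand-in
    for the learned dictionary used in sparse representation)."""
    n = patch_size * patch_size
    t = np.arange(n)
    D = np.cos(np.pi * np.outer(np.arange(atoms), 2 * t + 1) / (2 * atoms))
    return D / np.linalg.norm(D, axis=1, keepdims=True)


def free_energy_feature(subband, dictionary, patch_size=8, n_nonzero=8):
    """Mean sparse-approximation residual energy over non-overlapping patches,
    taken here as a proxy for the prediction error (free energy) of the
    internal generative model."""
    h, w = subband.shape
    patches = [subband[i:i + patch_size, j:j + patch_size].ravel()
               for i in range(0, h - patch_size + 1, patch_size)
               for j in range(0, w - patch_size + 1, patch_size)]
    X = np.asarray(patches)
    coder = SparseCoder(dictionary=dictionary, transform_algorithm="omp",
                        transform_n_nonzero_coefs=n_nonzero)
    codes = coder.transform(X)                 # sparse codes of each patch
    residual = X - codes @ dictionary          # reconstruction (prediction) error
    return float(np.mean(np.sum(residual ** 2, axis=1)))


def rr_features(image, dictionary):
    """One free-energy feature per subband of the two-level Haar decomposition."""
    return [free_energy_feature(b, dictionary) for b in haar_subbands(image)]


# Training and prediction (mos_train: subjective scores of the training images):
# D = dct_dictionary()
# X_train = [rr_features(ref, D) + rr_features(dst, D) for ref, dst in train_pairs]
# model = SVR(kernel="rbf").fit(X_train, mos_train)
# quality = model.predict([rr_features(ref_test, D) + rr_features(dst_test, D)])
```

In this sketch the per-subband features of the reference image form the small amount of side information transmitted in the reduced-reference setting, and the regressor learns the mapping from the combined reference and distorted features to subjective quality.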
