Abstract

Although blind image quality assessment (BIQA) is in high demand for many image processing systems, it is extremely difficult for BIQA to predict quality accurately without the guidance of a reference image. In this paper, we introduce a novel BIQA method based on hierarchical feature degradation (HFD). Since the human brain follows a hierarchical procedure for visual recognition, we suggest that different levels of distortion produce different degradations of hierarchical features, and we propose to consider the degradations of both low-level and high-level features for quality assessment. Inspired by the orientation selectivity (OS) mechanism in the primary visual cortex, an OS-based local visual structure is designed for low-level visual content extraction. Meanwhile, exploiting the feature integration capability of deep neural networks, deep semantics are extracted with a residual network for high-level visual content representation. Next, by analyzing the degradation of both the local structure and the deep semantics, an HFD-based memory (prior knowledge) is learned to represent the generalized quality degradation. Finally, with the guidance of the HFD-based memory, a novel HFD-BIQA model is built. Experimental results on publicly available databases demonstrate the quality prediction accuracy of the proposed HFD-BIQA and verify that HFD-BIQA is highly consistent with subjective perception.
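The abstract describes a two-level feature pipeline: an OS-based local structure descriptor for low-level content and residual-network features for high-level semantics, which are then analyzed jointly for quality prediction. The sketch below is only an illustration of that general idea, not the authors' implementation: the `orientation_histogram` descriptor, the ResNet-50 backbone, and the pooling and concatenation choices are all assumptions made for the example.

```python
# Illustrative sketch of hierarchical (low-level + high-level) feature extraction
# for BIQA. This is NOT the paper's HFD-BIQA method; descriptor and backbone
# choices are assumptions.

import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image


def orientation_histogram(gray: np.ndarray, bins: int = 8) -> np.ndarray:
    """Gradient-orientation histogram, a stand-in for an OS-based local
    structure descriptor (the paper's exact pattern is not given here)."""
    gy, gx = np.gradient(gray.astype(np.float32))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)   # orientations folded into [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)        # normalized low-level feature


def resnet_semantics(img: Image.Image) -> torch.Tensor:
    """Global-average-pooled ResNet-50 features as a proxy for 'deep semantics'."""
    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
    backbone = torch.nn.Sequential(*list(model.children())[:-1])  # drop classifier
    preprocess = T.Compose([
        T.Resize(256), T.CenterCrop(224), T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])
    with torch.no_grad():
        feat = backbone(preprocess(img).unsqueeze(0))
    return feat.flatten(1).squeeze(0)          # 2048-dim high-level feature


def hierarchical_features(img: Image.Image) -> np.ndarray:
    """Concatenate low-level structure and high-level semantic features;
    a regressor trained on such features could then predict a quality score."""
    gray = np.asarray(img.convert("L"))
    low = orientation_histogram(gray)
    high = resnet_semantics(img).numpy()
    return np.concatenate([low, high])
```

In this reading, image distortions perturb both the orientation statistics and the deep semantic features, and a learned mapping from the combined feature vector to a quality score would play the role the paper assigns to the HFD-based memory.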
