Abstract

Assessing the visual quality of super-resolution images (SRIs) is crucial for advancing algorithm development, but it remains an unsolved problem. In this paper, we present a novel reduced-reference image quality assessment (RR-IQA) method specifically suited for evaluating SRIs. Our approach leverages information from the input low-resolution (LR) image as a reference signal to extract features that are most relevant to modeling the visual quality of SRIs. We analyze the artifact characteristics of SRIs and demonstrate that features describing edge orientations, high-frequency components, and textures are the most important for this task. To extract these features, we first perform structure–texture decompositions (STD) on both the SRI and its LR input, then obtain the edge orientation feature through a traditional hand-crafted approach, and use deep neural networks to extract features related to high-frequency components and textures. We employ a shallow multilayer perceptron (MLP) to predict an image quality score from these quality-relevant features. To improve feature representation ability and prevent overfitting, we pretrain the feature extraction module, which accounts for over 99.8% of the total model parameters, on a large number of unlabeled samples. The scarce samples with mean opinion score (MOS) labels are then used to train a high-quality shallow MLP predictor. Our experimental results show that the proposed method outperforms classical and state-of-the-art models.
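The pipeline described above can be sketched in simplified form. The snippet below is a minimal illustration, not the authors' implementation: it stands in for the STD with iterated smoothing (the paper's decomposition method is not specified here), computes a gradient-based edge-orientation histogram as the hand-crafted feature, and scores a feature vector with a small randomly initialized MLP in place of the trained predictor. All function and class names (`structure_texture_split`, `edge_orientation_hist`, `ShallowMLP`) are hypothetical.

```python
import numpy as np

def structure_texture_split(img, iters=20):
    """Crude structure-texture decomposition: repeated local averaging
    yields a smooth structure layer; the residual is the texture layer."""
    s = img.astype(float).copy()
    for _ in range(iters):
        p = np.pad(s, 1, mode="edge")
        s = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
             + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0
    return s, img - s  # (structure, texture); they sum back to img

def edge_orientation_hist(img, bins=8):
    """Gradient-magnitude-weighted histogram of edge orientations in [0, pi)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-8)  # normalize to a distribution

class ShallowMLP:
    """One-hidden-layer perceptron mapping a feature vector to a scalar score
    (weights here are random; the real predictor is trained on MOS labels)."""
    def __init__(self, d_in, d_hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (d_in, d_hidden))
        self.b1 = np.zeros(d_hidden)
        self.W2 = rng.normal(0.0, 0.1, (d_hidden, 1))
        self.b2 = np.zeros(1)

    def predict(self, x):
        h = np.maximum(x @ self.W1 + self.b1, 0.0)  # ReLU hidden layer
        return float(h @ self.W2 + self.b2)

# Usage: compare orientation statistics of an SRI against its LR reference.
rng = np.random.default_rng(42)
sri = rng.random((64, 64))       # stand-in super-resolved image
lr = rng.random((16, 16))        # stand-in low-resolution input
_, sri_tex = structure_texture_split(sri)
_, lr_tex = structure_texture_split(lr)
features = np.concatenate([edge_orientation_hist(sri_tex),
                           edge_orientation_hist(lr_tex)])
score = ShallowMLP(d_in=features.size).predict(features)
```

In the actual method the high-frequency and texture features come from pretrained deep networks rather than histograms, and only the small MLP head is trained on MOS-labeled data, which is why the pretrained module can hold over 99.8% of the parameters.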
