Abstract

This study aims to develop advanced, training-free full-reference image quality assessment (FR-IQA) models based on deep neural networks. Specifically, we investigate measures that allow us to perceptually compare deep network features and reveal the factors underlying their effectiveness. We find that distribution measures offer superior perceptual awareness, and we test the Wasserstein distance (WSD), Jensen-Shannon divergence (JSD), and symmetric Kullback-Leibler divergence (SKLD) when comparing deep features extracted from various pretrained deep networks, including the Visual Geometry Group (VGG) network, SqueezeNet, MobileNet, and EfficientNet. The proposed FR-IQA models exhibit superior alignment with subjective human evaluations across diverse image quality assessment (IQA) datasets without any training, demonstrating the perceptual relevance of distribution measures for comparing deep network features. Additionally, we explore the applicability of deep distribution measures to image super-resolution enhancement, highlighting their potential for guiding perceptual enhancement. The code is available at https://github.com/Buka-Xing/Deep-network-based-distribution-measures-for-full-reference-image-quality-assessment.
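The three distribution measures named above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes deep features are flattened to 1-D samples and compared via shared histograms, and all function names (`feature_histograms`, `skld`, `jsd`, `wsd_1d`) are hypothetical illustrations of the underlying formulas.

```python
import numpy as np

def feature_histograms(feat_ref, feat_dist, bins=64):
    """Project two (flattened) feature maps onto a shared 1-D histogram support.

    Returns two normalized, epsilon-smoothed histograms and the bin edges.
    """
    lo = min(feat_ref.min(), feat_dist.min())
    hi = max(feat_ref.max(), feat_dist.max())
    p, edges = np.histogram(feat_ref, bins=bins, range=(lo, hi))
    q, _ = np.histogram(feat_dist, bins=bins, range=(lo, hi))
    eps = 1e-12  # smoothing so the KL-based measures stay finite
    p = p.astype(float) + eps
    q = q.astype(float) + eps
    return p / p.sum(), q / q.sum(), edges

def skld(p, q):
    """Symmetric Kullback-Leibler divergence: KL(p||q) + KL(q||p)."""
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL to the mixture m = (p+q)/2."""
    m = 0.5 * (p + q)
    return float(0.5 * np.sum(p * np.log(p / m)) + 0.5 * np.sum(q * np.log(q / m)))

def wsd_1d(p, q, edges):
    """1-D Wasserstein distance as the integrated absolute CDF difference."""
    widths = np.diff(edges)
    return float(np.sum(np.abs(np.cumsum(p) - np.cumsum(q)) * widths))
```

In an FR-IQA setting, `feat_ref` and `feat_dist` would be activations of the same pretrained layer (e.g. a VGG stage) for the reference and distorted image; larger values of any of the three measures indicate a larger perceptual difference between the feature distributions.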
