Abstract

Pixel-wise image quality assessment (IQA) algorithms such as mean square error (MSE), mean absolute error (MAE), and peak signal-to-noise ratio (PSNR) correlate well with perceptual quality for images sharing the same distortion type, but poorly across different distortion types, which is inconsistent with the human visual system (HVS). Although many metrics based on image error have been proposed, difficulties and limitations remain. To address this problem, a full-reference image quality assessment (FR-IQA) method based on MAE is proposed in this paper. The metric divides the image error map (the difference between the distorted image and the reference image) into a smooth region and a texture-edge region, computes the mean error of each region, and then assigns the regions different weights to account for the masking effect. The key innovation of this paper is a distortion significance measurement: a visual quality coefficient that effectively indicates the influence of different distortion types on perceptual quality and unifies them with the HVS. The segmented image error maps are weighted by this distortion significance coefficient. Experimental results on four large benchmark databases show that most distortions are evaluated successfully and the results are consistent with the HVS.
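The pipeline the abstract describes (split the error map into smooth and texture-edge regions, average each, weight the averages, and scale by a distortion-significance coefficient) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the gradient-based region classifier, the threshold, the region weights, and the fixed `significance` parameter are all placeholder assumptions.

```python
import numpy as np

def weighted_mae(reference, distorted, grad_threshold=0.1,
                 w_texture=0.7, w_smooth=0.3, significance=1.0):
    """Sketch of a region-weighted MAE in the spirit of the abstract.

    The error map is split into smooth and texture-edge regions using the
    gradient magnitude of the reference image; each region's mean absolute
    error is weighted (masking effect) and the sum is scaled by a
    distortion-significance coefficient. All thresholds and weights here
    are illustrative placeholders, not values from the paper.
    """
    ref = reference.astype(np.float64)
    dist = distorted.astype(np.float64)
    error = np.abs(dist - ref)  # pixel-wise error map (MAE basis)

    # Classify pixels: high gradient magnitude -> texture-edge region.
    gy, gx = np.gradient(ref)
    grad_mag = np.hypot(gx, gy)
    texture_mask = grad_mag > grad_threshold * grad_mag.max()

    # Mean error in each region (guarding against empty regions).
    mae_texture = error[texture_mask].mean() if texture_mask.any() else 0.0
    mae_smooth = error[~texture_mask].mean() if (~texture_mask).any() else 0.0

    # Weighted combination, scaled by the distortion-significance coefficient.
    return significance * (w_texture * mae_texture + w_smooth * mae_smooth)
```

In the paper the significance coefficient is distortion-type dependent; here it is a single scalar purely to show where it enters the computation.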
