Abstract

The ultimate goal of no-reference omnidirectional image quality assessment (NR-OIQA) is to design a comprehensive perception method that can accurately assess the quality of distorted omnidirectional images without any reference information. However, most existing methods fail to attain credible accuracy because their models are not grounded in neuroscience or biology. Motivated by this, we propose an original visual-perception- and neuroscience-based OIQA model that considers the hierarchical perceptual features of the human visual system (HVS): specific information, local saliency information, global information, and color information, the last of which is often neglected by researchers. Following the hierarchical process described in neuroscience, high-frequency co-occurrence matrix (HFCM)-based and variance-based specific features are applied to perceive the details that are distorted first in the frequency domain. An entropy-based combination of the paranormal saliency map (PSM) and superpixel segmentation with the simple linear iterative clustering (SLIC) algorithm is employed to emphasize rich, quality-aware local saliency information. The global panoramic statistical (GPS) model is used to express global semantic distortion as a high-level feature. A visual-aware color texture descriptor, the cross-channel local binary pattern (CCLBP), which effectively reflects the correlation and dependency of pixels across different color channels, is employed to capture color information. Finally, all of the above features are extracted and combined with subjective scores to predict objective quality scores via support vector regression (SVR). Experiments show that our method achieves higher accuracy and stronger stability on the CVIQD2018 and OIQA databases.
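To make the color-texture stage concrete, the following is a minimal NumPy sketch of a cross-channel local binary pattern: the center pixel is taken from one color channel and its eight neighbors from another, so the resulting codes capture dependencies between channels. This is a simplified illustration of the CCLBP idea, not the authors' exact encoding; the channel pairing and the 256-bin histogram descriptor are assumptions for the example.

```python
import numpy as np

def cross_channel_lbp(center_ch, neighbor_ch):
    """8-neighbor LBP where the center pixel comes from one color channel
    and the comparison neighbors from another, capturing cross-channel
    pixel dependencies (a simplified reading of CCLBP; the paper's exact
    encoding may differ)."""
    h, w = center_ch.shape
    c = center_ch[1:-1, 1:-1]                      # interior centers
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # clockwise 8-neighborhood offsets; each neighbor contributes one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        nb = neighbor_ch[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (nb >= c).astype(np.uint8) << bit
    return codes

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
# e.g. R-channel centers compared against G-channel neighbors (assumed pairing)
codes = cross_channel_lbp(img[..., 0].astype(np.int16),
                          img[..., 1].astype(np.int16))
hist, _ = np.histogram(codes, bins=256, range=(0, 256))
feat = hist / hist.sum()   # normalized 256-bin descriptor for the regressor
```

In the full pipeline, descriptors like `feat` would be concatenated with the specific, saliency, and global features and fed to the SVR stage.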
