Abstract

The use of 3D technologies is growing rapidly, and stereoscopic imaging is the usual way of displaying 3D content. However, compression, transmission, and other necessary processing steps may degrade the quality of these images. Stereoscopic Image Quality Assessment (SIQA) has therefore attracted increasing attention as a means of ensuring a good viewing experience for users, and several methods have been proposed in the literature, with deep learning-based approaches showing a clear improvement. This paper introduces a new deep learning-based no-reference SIQA method that relies on the cyclopean view hypothesis and human visual attention. First, the cyclopean image is constructed in a way that accounts for binocular rivalry, which covers the asymmetric-distortion case. Second, a saliency map is computed that incorporates depth information; it is used to extract patches from the most perceptually relevant regions. Finally, a modified version of a pre-trained Convolutional Neural Network (CNN) is fine-tuned and used to predict the quality score from the selected patches. Five distinct pre-trained models were analyzed and compared in terms of results. The performance of the proposed metric has been evaluated on four commonly used datasets (the LIVE 3D Phase I and Phase II databases as well as Waterloo IVC 3D Phase 1 and Phase 2). Compared with state-of-the-art metrics, the proposed method yields better results. The implementation code will be made publicly available at: https://github.com/o-messai/3D-NR-SIQA
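The cyclopean-image construction mentioned above can be illustrated with a minimal sketch. This is not the paper's exact formulation: it assumes a simple energy-based weighting, where each view's contribution is proportional to its local contrast energy (a common proxy for binocular-rivalry dominance), and a precomputed horizontal disparity map used to align the right view with the left. The function names and the window-variance energy measure are illustrative assumptions, not the authors' code.

```python
import numpy as np

def local_energy(img, win=9):
    """Local contrast energy (windowed variance) as a simple proxy for
    rivalry dominance; real SIQA models often use Gabor filter energy."""
    pad = win // 2
    padded = np.pad(img.astype(np.float64), pad, mode="reflect")
    h, w = img.shape
    out = np.empty((h, w), dtype=np.float64)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + win, j:j + win].var()
    return out

def cyclopean_image(left, right, disparity):
    """Fuse the left view and the disparity-compensated right view into a
    cyclopean image using energy-based binocular-rivalry weights."""
    h, w = left.shape
    # Compensate the right view: shift each pixel by its horizontal disparity.
    cols = np.clip(np.arange(w)[None, :] + disparity.astype(int), 0, w - 1)
    rows = np.arange(h)[:, None]
    right_comp = right[rows, cols]
    # Normalized energy weights: the more energetic view dominates locally.
    e_l = local_energy(left)
    e_r = local_energy(right_comp)
    w_l = e_l / (e_l + e_r + 1e-12)
    return w_l * left + (1.0 - w_l) * right_comp
```

For symmetric distortions the two weights stay close to 0.5 everywhere; under asymmetric distortion the less-degraded (higher-energy) view dominates the fused image, which is the behavior the binocular-rivalry model is meant to capture.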
