Abstract
Visual comfort assessment (VCA) for stereoscopic three-dimensional (S3D) images is a challenging problem in the community of 3D quality of experience (3D-QoE). The goal of VCA is to automatically predict the degree of perceived visual discomfort in line with subjective judgment. The challenges of VCA typically lie in two aspects: 1) formulating effective visual comfort-aware features, and 2) finding an appropriate way to pool them into an overall visual comfort score. In this paper, a novel two-stage framework is proposed to address these problems. In the first stage, a primary predictive feature (PPF) and an advanced predictive feature (APF) are separately extracted and then integrated to reflect the perceived visual discomfort in 3D viewing. Specifically, we compute S3D visual attention-weighted disparity statistics and the neural activities of the middle temporal (MT) area of the human brain to construct the PPF and APF, respectively. In the second stage, the integrated visual comfort-aware features are fused into a single visual comfort score by random forest (RF) regression, mapping from a high-dimensional feature space into a low-dimensional quality (visual comfort) space. Comparison results with five state-of-the-art relevant models on a standard benchmark database confirm the superior performance of the proposed method.
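The second-stage pooling step described above can be sketched as follows. This is a hedged illustration only, not the authors' implementation: the feature dimensions, the synthetic PPF/APF values, and the subjective score range are assumptions made for demonstration.

```python
# Sketch of pooling integrated visual comfort-aware features into a single
# score with random forest regression. All data below is synthetic and
# hypothetical; dimensions and score range are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical integrated features: PPF (attention-weighted disparity
# statistics) concatenated with APF (MT-area neural activity), one
# feature vector per S3D image.
n_images, n_features = 120, 16
X = rng.normal(size=(n_images, n_features))

# Hypothetical subjective mean opinion scores of visual comfort in [1, 5].
y = rng.uniform(1.0, 5.0, size=n_images)

# RF regression maps the high-dimensional feature space to a single
# visual comfort score per image.
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X[:100], y[:100])
scores = rf.predict(X[100:])  # predicted comfort scores for held-out images
```

Because an RF prediction averages training targets over the leaves, every predicted score necessarily falls within the range of the training scores.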