Enhancing Perceptual Quality Assessment for 360-Degree Images Based on Adaptive Patch Labeling and Multi-Label Learning
This paper addresses perceptual quality assessment for 360-degree images, aiming to improve the precision of quality models. In contrast to conventional methods that map different image regions to a single mean opinion score (MOS), we train the model to predict multiple labels derived from both subjective and objective measures, using a quality labeling framework designed to capture perceptual differences across diverse regions of panoramic content. This makes the training process more flexible and stable. In addition, we design a loss function for the multi-label learning scheme that accounts for both the magnitude and the direction of quality labels, combining absolute and directional distance terms. Experimental results highlight the limitations of relying on MOS as the sole label and show improved performance, demonstrating the potential of our approach for advancing perceptual quality assessment models.
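The abstract does not give the exact form of the magnitude-and-direction loss. A minimal sketch of one possible formulation, assuming an L1 term for the absolute (magnitude) distance and a cosine-distance term for the directional component; the function name, the `alpha` weighting, and the epsilon guard are illustrative assumptions, not details from the paper:

```python
import numpy as np

def multilabel_quality_loss(pred, target, alpha=0.5):
    """Hypothetical combined loss over multi-label quality vectors.

    - absolute term: mean absolute distance between predicted and
      target label values (magnitude agreement)
    - directional term: cosine distance between the two label vectors
      (agreement in relative label orientation)
    `alpha` balances the two terms (assumed hyperparameter).
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    # Magnitude: how far each predicted label is from its target.
    abs_term = np.mean(np.abs(pred - target))
    # Direction: penalize disagreement in the relative pattern of labels.
    cos_sim = np.dot(pred, target) / (
        np.linalg.norm(pred) * np.linalg.norm(target) + 1e-8
    )
    dir_term = 1.0 - cos_sim
    return alpha * abs_term + (1.0 - alpha) * dir_term
```

A perfect prediction drives both terms toward zero, while a prediction that reverses the ordering of region labels is penalized by the directional term even when its mean magnitude error is moderate.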