Abstract

Along with the development of virtual reality (VR), omnidirectional images play an important role in producing multimedia content with an immersive experience. However, despite various existing approaches to omnidirectional image stitching, how to quantitatively assess the quality of stitched images is still insufficiently explored. To address this problem, we first establish a novel omnidirectional image dataset containing stitched images as well as dual-fisheye images captured from the standard quarters of 0°, 90°, 180°, and 270°. In this manner, when evaluating the quality of an image stitched from one pair of fisheye images (e.g., 0° and 180°), the other pair (e.g., 90° and 270°) can serve as a cross-reference that provides ground-truth observations of the stitching regions. Based on this dataset, we propose a set of Omnidirectional Stitching Image Quality Assessment (OS-IQA) metrics. In these metrics, the stitching regions are assessed by exploring the local relationships between the stitched image and its cross-reference via histogram statistics, perceptual hashing, and sparse reconstruction, while the whole stitched image is assessed by the global indicators of color difference and fitness of blind zones. Qualitative and quantitative experiments show that our method outperforms classic IQA metrics and is highly consistent with human subjective evaluations. To the best of our knowledge, this is the first attempt to assess the stitching quality of omnidirectional images using cross-references.
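To give a flavor of how a cross-reference comparison might work, the following is a minimal sketch of the perceptual-hash idea mentioned in the abstract: a stitching region from the stitched image is hashed and compared against the corresponding region seen in the cross-reference pair, with a small Hamming distance indicating visual agreement. This is an illustration only, not the paper's actual OS-IQA implementation; the patch data, function names, and the simplified average-hash (which skips the usual resize-to-8×8 step) are all assumptions made for brevity.

```python
def average_hash(patch):
    """Compute a simple average-hash bit list for a grayscale patch.

    Each pixel brighter than the patch mean maps to 1, else 0. Real
    perceptual hashes first resize the patch to a fixed size (e.g., 8x8);
    that step is omitted here for brevity.
    """
    pixels = [p for row in patch for p in row]
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming_distance(h1, h2):
    """Count differing bits; smaller means the regions look more alike."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Hypothetical 4x4 grayscale patches: a stitching region from the stitched
# image and the same scene content observed in the cross-reference view.
stitched = [[10, 12, 200, 210],
            [11, 13, 198, 205],
            [ 9, 14, 202, 208],
            [12, 11, 199, 207]]
reference = [[12, 10, 205, 206],
             [10, 15, 201, 203],
             [11, 12, 200, 209],
             [13, 10, 198, 204]]

d = hamming_distance(average_hash(stitched), average_hash(reference))
print(d)  # same bright/dark layout despite pixel noise -> distance 0
```

Because the hash encodes coarse structure rather than exact pixel values, small exposure or noise differences between the two capture positions do not inflate the distance, which is the property that makes hash-style comparison attractive for cross-reference assessment.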
