Abstract
First-person videos (FPVs) recorded by wearable cameras have different characteristics from videos captured by mobile devices. Video frames in FPVs are subject to blur, rotation, shear, and fisheye distortions. We design a subjective test that uses actually captured images with real distortions, synthetic distortions, or a combination of both. Results indicate that shear is less sensitive to content than rotation. For fisheye, personal preference and content dependence affect the subjective results. The performance of seven no-reference (NR) quality estimators (QEs) and our QE, local visual information (LVI) [1], is evaluated against the subjective results. We propose two mapping functions for rotation and shear that improve the ability of LVI and four NR QEs to accurately predict the subjective scores.