Abstract

Blind video quality assessment of user-generated content (UGC) has become an increasingly important and challenging problem. Previous studies have shown the efficacy of natural scene statistics for capturing spatial distortions, but the exploration of temporal video statistics on UGC remains relatively limited. Here we propose the first general, effective, and efficient temporal statistics model for UGC video quality assessment that accounts for temporal- and motion-related distortions by analyzing regularities in the temporal bandpass domain. The proposed temporal model can serve as a plug-in module to boost existing no-reference video quality predictors that lack motion-relevant features. Experimental results on recent large-scale UGC video databases show that the proposed model significantly improves the performance of existing methods at reasonable computational cost.
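
As an illustration only: the abstract does not specify the exact temporal bandpass filter or statistical model, so the sketch below assumes a simple frame-difference bandpass and a generalized Gaussian distribution (GGD) fit to the resulting coefficients, in the spirit of natural-scene-statistics features. The paper's actual filters and statistics may differ; all function names here are hypothetical.

import numpy as np
from scipy.special import gamma


def ggd_shape_estimate(coeffs):
    """Estimate GGD (shape, scale) by standard moment matching (assumed, not the paper's method)."""
    coeffs = coeffs.ravel()
    sigma_sq = np.mean(coeffs ** 2)
    mean_abs = np.mean(np.abs(coeffs))
    rho = sigma_sq / (mean_abs ** 2 + 1e-12)
    # Match the empirical ratio E[x^2]/E[|x|]^2 against Gamma(1/a)Gamma(3/a)/Gamma(2/a)^2
    shapes = np.arange(0.2, 10.0, 0.001)
    ratios = gamma(1.0 / shapes) * gamma(3.0 / shapes) / (gamma(2.0 / shapes) ** 2)
    best_shape = shapes[np.argmin(np.abs(ratios - rho))]
    return best_shape, np.sqrt(sigma_sq)


def temporal_bandpass_features(frames):
    """Toy temporal-bandpass NSS features from a grayscale video.

    frames: array of shape (T, H, W), float32 luminance.
    Returns a (shape, scale) pair fitted to frame-difference coefficients.
    """
    diffs = frames[1:] - frames[:-1]   # crude temporal bandpass via frame differencing (assumption)
    diffs = diffs - diffs.mean()       # zero-mean before fitting the GGD
    shape, scale = ggd_shape_estimate(diffs)
    return np.array([shape, scale])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_video = rng.standard_normal((16, 64, 64)).astype(np.float32)
    print(temporal_bandpass_features(toy_video))

In a plug-in setting of the kind the abstract describes, features like these would simply be concatenated with an existing no-reference predictor's spatial features before regression; the specific fusion used in the paper is not stated here.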
