Abstract

Video instability is a significant problem for videos captured by handheld or body-worn cameras. Accurate video stability estimation that is consistent with human perception is the basis of effective video stabilization algorithms. It is also useful for comparing different video stabilization algorithms and constructing benchmarks. In this paper, we present a perception-inspired video stability estimator based on 2D image motions. It calculates the fraction of information in each frame that can be perceived by human eyes. Experimental results show that our stability estimator accurately estimates subjective video stability scores. It requires less computation time and is more accurate and robust under different scene structures than methods based on 3D motions.
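To make the idea of a 2D-motion-based stability score concrete, the sketch below estimates inter-frame 2D camera motion with sparse optical flow and converts it into a rough per-frame "viewable fraction" proxy. This is only a minimal illustration under stated assumptions, not the estimator proposed in the paper: the scoring formula, parameter values, and the input file name are assumptions introduced here.

```python
# Minimal illustrative sketch (NOT the paper's estimator): estimate 2D inter-frame
# motion with sparse optical flow and report a crude proxy for the fraction of
# frame content that stays viewable despite camera shake. The scoring formula and
# all parameters below are assumptions for illustration only.
import cv2
import numpy as np

def stability_score(video_path, max_corners=200):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError(f"Cannot read video: {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    h, w = prev_gray.shape
    diag = np.hypot(h, w)
    scores = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Track sparse corner features from the previous frame to the current one.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=10)
        if pts is not None:
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good_old = pts[status.flatten() == 1]
            good_new = nxt[status.flatten() == 1]
            if len(good_old) >= 3:
                # Fit a 2D similarity model to summarize the dominant camera motion.
                M, _ = cv2.estimateAffinePartial2D(good_old, good_new)
                if M is not None:
                    shift = np.hypot(M[0, 2], M[1, 2])
                    # Crude per-frame score: fraction of the frame diagonal NOT
                    # displaced by inter-frame motion, clipped to [0, 1].
                    scores.append(max(0.0, 1.0 - shift / diag))
        prev_gray = gray

    cap.release()
    return float(np.mean(scores)) if scores else 0.0

if __name__ == "__main__":
    print(stability_score("shaky_clip.mp4"))  # hypothetical input file
```

Because the sketch works only with 2D image motions, it avoids the cost and fragility of 3D reconstruction, which mirrors the paper's motivation for a 2D-motion-based estimator, although the actual perception-inspired measure in the paper differs from this toy score.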
