Abstract

Video instability is a significant problem for videos captured by handheld or body-worn cameras. An accurate video stability estimate that is consistent with human perception is the basis of effective video stabilization algorithms; it is also useful for comparing video stabilization algorithms and constructing benchmarks. In this paper, we present a perception-inspired video stability estimator based on 2D image motions. The estimator computes the fraction of information in each frame that can be perceived by the human eye. Experimental results show that our stability estimator accurately predicts subjective video stability scores. Compared with methods based on 3D motions, it requires less computation time and is more accurate and robust across different scene structures.
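As a rough illustration of a 2D-motion-based stability estimate (not the paper's actual perceptual model), the sketch below tracks sparse features between consecutive frames with OpenCV and maps the median inter-frame displacement to a per-frame score in [0, 1]. The function name, the `motion_threshold` parameter, and the linear mapping from motion magnitude to a "perceivable fraction" are all hypothetical assumptions made for illustration.

```python
import cv2
import numpy as np


def estimate_stability(video_path, motion_threshold=30.0):
    """Illustrative per-frame stability score from 2D inter-frame motion.

    The fraction of each frame treated as perceivable is modeled here as a
    simple decreasing function of the median feature displacement; the
    paper's perceptual model is not reproduced.
    """
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError("Cannot read video: " + video_path)
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Track sparse features to obtain 2D image motions between frames.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)
        if pts is not None:
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good = status.ravel() == 1
            if good.any():
                disp = np.linalg.norm(nxt[good] - pts[good], axis=-1)
                motion = float(np.median(disp))
                # Map motion magnitude to a [0, 1] "perceivable fraction"
                # (assumed linear falloff; purely for illustration).
                scores.append(max(0.0, 1.0 - motion / motion_threshold))
        prev_gray = gray
    cap.release()
    # Average per-frame scores into a single video-level stability estimate.
    return float(np.mean(scores)) if scores else 1.0
```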
