Abstract

Temporal segmentation of a video sequence into different shots is fundamental to a number of video retrieval and analysis applications. Motion estimation has been widely used in video processing, since it provides the most essential information about an image sequence. In this paper, we explore the possibility of exploiting motion and illumination estimation in a video sequence to detect various types of shot changes. Optical flow is the motion vector computed at each pixel of an image sequence from intensity variations. Traditionally, optical flow computation algorithms have been derived from the brightness constancy assumption. Here, we employ a generalized optical flow constraint that includes an illumination parameter to model local illumination changes. We develop an iterative optical flow and illumination estimation algorithm that refines the generalized optical flow constraints step by step, leading to very accurate estimates of the optical flow and illumination parameters. Two robust measures are defined from the mean and standard deviation of the estimated intensity compensation values for all the blocks in the same image. Each of these two measures responds significantly to various types of shot changes. We demonstrate the usefulness of these two measures through experiments.
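
As a rough illustration of how the two shot-change measures could be derived, the minimal sketch below assumes grayscale frames given as NumPy arrays and replaces the paper's iterative optical-flow and illumination estimation with a simple per-block least-squares multiplicative compensation; the function names, block size, and compensation model are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def block_compensation_estimates(prev, curr, block=16):
    """Estimate a per-block intensity compensation factor between two
    consecutive grayscale frames (a simplified stand-in for the paper's
    iterative optical-flow and illumination estimation)."""
    h, w = prev.shape
    estimates = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            p = prev[y:y + block, x:x + block].astype(np.float64)
            c = curr[y:y + block, x:x + block].astype(np.float64)
            # Least-squares fit of a multiplicative model c ≈ k * p per block.
            denom = np.sum(p * p)
            estimates.append(np.sum(p * c) / denom if denom > 0 else 1.0)
    return np.asarray(estimates)

def shot_change_measures(prev, curr, block=16):
    """Return the two measures described in the abstract: the mean and
    standard deviation of the per-block intensity compensation values."""
    k = block_compensation_estimates(prev, curr, block)
    return k.mean(), k.std()
```

In this sketch, a compensation mean far from 1 or a large standard deviation across blocks between consecutive frames would signal a likely shot change; the actual decision rules and thresholds belong to the paper's experiments.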
