Abstract

In this paper, we present scenario-based dynamic video abstraction using graph matching. Our approach has two main components: multi-level scenario generation and dynamic video abstraction. Multi-level scenarios are generated by graph-based video segmentation and a hierarchical organization of the resulting segments; dynamic video abstractions are obtained by traversing the generated hierarchy level by level. The first step of the proposed approach segments a video into shots using the Region Adjacency Graph (RAG), which expresses the spatial relationships among the segmented regions of a frame. To measure the similarity between two consecutive RAGs, we propose a new similarity measure, called the Graph Similarity Measure (GSM). Next, we construct a tree structure, called the scene tree, based on the correlation between the detected shots. The correlation is computed with the GSM because it properly captures the relationships between shots. Multi-level scenarios, which provide video abstractions at various levels of detail, are then generated from the constructed scene tree. We provide two types of abstraction based on multi-level scenarios: multi-level highlights and multi-length summarizations. A multi-level highlight consists of all the shots at a given scenario level. To summarize a video at various lengths, we select key frames by considering the temporal relationships among RAGs, again computed with the GSM. We have developed a system, called the Automatic Video Analysis System (AVAS), that integrates the proposed techniques to demonstrate their effectiveness. The experimental results show that the proposed techniques are promising.
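The abstract does not define how a RAG is represented or how the GSM is computed, so the following minimal Python sketch is illustrative only: it models a RAG as a set of region labels plus a set of adjacency edges, and uses a hypothetical Jaccard-style overlap of nodes and edges as a stand-in for the paper's GSM when comparing consecutive frames.

from dataclasses import dataclass

@dataclass(frozen=True)
class RAG:
    """A Region Adjacency Graph for one frame: nodes are region labels,
    edges connect spatially adjacent regions."""
    nodes: frozenset  # region labels (e.g., dominant colour/texture bins)
    edges: frozenset  # adjacency pairs, each a frozenset({a, b})

def graph_similarity(g1: RAG, g2: RAG) -> float:
    """Hypothetical similarity in [0, 1] between two consecutive RAGs:
    the average Jaccard overlap of their node sets and edge sets.
    (Stand-in for the GSM, which is not specified in the abstract.)"""
    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 1.0
    return 0.5 * (jaccard(g1.nodes, g2.nodes) + jaccard(g1.edges, g2.edges))

# Example: a shot boundary could be declared when the similarity between
# consecutive frames drops below a chosen threshold.
frame_t  = RAG(frozenset({"sky", "road", "car"}),
               frozenset({frozenset({"sky", "road"}), frozenset({"road", "car"})}))
frame_t1 = RAG(frozenset({"sky", "road"}),
               frozenset({frozenset({"sky", "road"})}))
print(graph_similarity(frame_t, frame_t1))  # ~0.58: similar frames, likely the same shot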
