Abstract

This paper presents a method for semantic shot classification in baseball videos based on the similarity of visual features. Because it is difficult to prepare a large amount of annotated training data, accurate event detection methods that can be built from only a small amount of training data are needed. In broadcast baseball video, the camera angle differs for each type of event, so shot changes and event changes are closely related. When the visual features of two shots are similar, the corresponding events are also likely to be similar, so a simple distance-based approach that relies only on the training data is effective. Semantic shot classification based on visual features can therefore be realized from a small amount of training data.
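To illustrate the "simple distance-based approach" mentioned above, the following is a minimal sketch of a 1-nearest-neighbor shot classifier over precomputed visual feature vectors. The feature representation (here a toy 3-bin color histogram), the Euclidean distance metric, and the event labels are assumptions for illustration only; the abstract does not specify the paper's exact features or distance measure.

```python
# Minimal sketch: distance-based shot classification from a small labeled set.
# Assumes each shot is summarized by a fixed-length visual feature vector
# (e.g., a color histogram); labels and values below are hypothetical.
import numpy as np

def classify_shot(shot_feature, train_features, train_labels):
    """Assign the label of the nearest training shot (1-nearest neighbor)."""
    # Euclidean distance from the query shot to every labeled training shot.
    distances = np.linalg.norm(train_features - shot_feature, axis=1)
    return train_labels[int(np.argmin(distances))]

# Hypothetical small training set: one feature vector per labeled shot.
train_features = np.array([
    [0.80, 0.10, 0.10],  # e.g., pitching view
    [0.20, 0.70, 0.10],  # e.g., outfield view
    [0.10, 0.20, 0.70],  # e.g., player close-up
])
train_labels = ["pitch", "outfield", "close-up"]

query = np.array([0.75, 0.15, 0.10])
print(classify_shot(query, train_features, train_labels))  # -> "pitch"
```

Because each broadcast event type tends to be shot from a characteristic camera angle, even such a simple nearest-neighbor rule over shot-level features can separate event classes with few labeled examples.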
