Abstract

This paper presents a method for semantic shot classification in baseball videos based on the similarity of visual features. Because preparing a large amount of annotated training data is difficult, accurate event detection methods that can be constructed from a small amount of training data are needed. In broadcast baseball video, camera view angles differ for each event, so shot changes and event changes are closely related. When the visual features of shots are similar, the corresponding events are also similar, and a simple distance-based approach that relies only on the training data is effective. Therefore, semantic shot classification based on visual features can be realized from a small amount of training data.

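To illustrate the distance-based idea described above, the following is a minimal sketch of nearest-neighbour shot classification against a small labelled training set. The choice of HSV colour histograms as the visual feature, Euclidean distance as the metric, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of distance-based shot classification.
# Assumptions (not from the paper): HSV colour histograms as the visual
# feature and 1-nearest-neighbour matching with Euclidean distance.
import numpy as np


def hsv_histogram(frame_hsv: np.ndarray, bins: int = 16) -> np.ndarray:
    """Concatenated, normalised per-channel histograms of an HSV frame."""
    feats = []
    for channel in range(3):
        hist, _ = np.histogram(frame_hsv[..., channel], bins=bins, range=(0, 256))
        feats.append(hist)
    v = np.concatenate(feats).astype(float)
    return v / (v.sum() + 1e-9)


def classify_shot(shot_feature: np.ndarray,
                  train_features: np.ndarray,
                  train_labels: list) -> str:
    """Assign the label of the nearest training shot in feature space."""
    dists = np.linalg.norm(train_features - shot_feature, axis=1)
    return train_labels[int(np.argmin(dists))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Tiny synthetic training set: one representative frame per shot class.
    train_frames = {"pitch": rng.integers(0, 256, (120, 160, 3)),
                    "infield": rng.integers(0, 256, (120, 160, 3)),
                    "close-up": rng.integers(0, 256, (120, 160, 3))}
    labels = list(train_frames)
    features = np.stack([hsv_histogram(f) for f in train_frames.values()])
    query = hsv_histogram(rng.integers(0, 256, (120, 160, 3)))
    print(classify_shot(query, features, labels))
```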