Abstract

Providing users with interesting videos requires relevant tags and annotations that efficiently describe a whole video or its segments. Because generating annotations and tags manually is time-consuming, it is essential to analyze videos without human intervention. Although there have been many studies of implicit human-centered tagging using bio-signals, most of them focus on affective tagging and tag relevance assessment. This paper proposes binary and unary classification models that recognize actions meaningful to users in videos, such as jumps in a figure skating program, using the EEG features of band power (BP) values and asymmetry scores (AS). The binary and unary classification models achieved best balanced accuracies of 52.86% and 50.06%, respectively. The binary classification models showed high specificity for non-jump actions, while the unary classification models showed high sensitivity for jump actions.
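
The abstract does not describe a concrete pipeline, so the following is only a minimal sketch of how BP and AS features and the two model types might be set up. The sampling rate, frequency bands, electrode pair, and the choice of SVC/OneClassSVM classifiers are illustrative assumptions, not the authors' method.

```python
# Sketch (assumptions throughout): band-power (BP) and asymmetry-score (AS)
# features from EEG epochs, then a binary SVM vs. a unary (one-class) SVM.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC, OneClassSVM
from sklearn.metrics import balanced_accuracy_score

FS = 128                                                       # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands
LEFT, RIGHT = 0, 1                     # indices of a symmetric electrode pair, e.g. F3/F4

def bp_as_features(epoch):
    """epoch: (n_channels, n_samples) EEG segment -> BP values plus one AS per band."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        bp = psd[:, mask].mean(axis=1)                # band power per channel
        asym = np.log(bp[RIGHT]) - np.log(bp[LEFT])   # asymmetry score for the pair
        feats.extend(bp.tolist() + [asym])
    return np.array(feats)

def evaluate(X_train, y_train, X_test, y_test):
    # Binary model: trained on both jump (1) and non-jump (0) epochs.
    binary = SVC(kernel="rbf").fit(X_train, y_train)
    # Unary model: trained on jump epochs only; +1 = inlier (jump), -1 = outlier.
    unary = OneClassSVM(nu=0.1).fit(X_train[y_train == 1])
    y_bin = binary.predict(X_test)
    y_una = (unary.predict(X_test) == 1).astype(int)
    return (balanced_accuracy_score(y_test, y_bin),
            balanced_accuracy_score(y_test, y_una))
```

Balanced accuracy is used here, as in the abstract, because jump segments are much rarer than non-jump segments, so plain accuracy would favor a model that always predicts "non-jump".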
