Affective video content analysis has emerged as one of the most challenging and essential research tasks, as it aims to automatically analyze the emotions elicited by videos. However, progress in this field has been limited by the enigmatic nature of emotions, which widens the gap between human affective states and the structure of a video. In this paper, we propose a novel deep affect-based movie trailer classification framework. We also develop the EmoGDB dataset, which contains 100 Bollywood movie trailers annotated with six popular movie genres (Action, Comedy, Drama, Horror, Romance, Thriller) and six types of induced emotions (Anger, Fear, Happy, Neutral, Sad, Surprise). The affect-based features are learned via the ILDNet architecture trained on the EmoGDB dataset. Our work analyzes the relationship between the emotions elicited by movie trailers and how these emotions contribute to solving the multi-label genre classification problem. The proposed framework is validated by cross-dataset testing on three large-scale datasets, namely LMTD-9, MMTF-14K, and ML-25M. Extensive experiments show that the proposed algorithm significantly outperforms state-of-the-art methods in terms of precision, recall, F1 score, precision–recall curves (PRC), and area under the PRC.
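The abstract itself contains no code; as an illustrative sketch only, the snippet below shows how the multi-label evaluation metrics it names (precision, recall, F1, and area under the precision–recall curve) can be computed with scikit-learn. The genre list and the example labels/scores are hypothetical placeholders, not results or data from the paper.

```python
# Hypothetical sketch: multi-label evaluation metrics for genre classification.
# The ground truth and scores below are illustrative, not from the paper.
import numpy as np
from sklearn.metrics import (precision_score, recall_score, f1_score,
                             average_precision_score)

GENRES = ["Action", "Comedy", "Drama", "Horror", "Romance", "Thriller"]

# y_true: binary multi-label ground truth, shape (trailers, genres)
y_true = np.array([
    [1, 0, 1, 0, 0, 1],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 1, 0, 1],
])

# y_score: per-genre confidence scores from a classifier (e.g. sigmoid outputs)
y_score = np.array([
    [0.9, 0.1, 0.7, 0.2, 0.1, 0.8],
    [0.2, 0.8, 0.6, 0.1, 0.7, 0.3],
    [0.7, 0.2, 0.4, 0.6, 0.2, 0.9],
])

# Threshold the scores to obtain hard multi-label predictions.
y_pred = (y_score >= 0.5).astype(int)

print("precision (micro):", precision_score(y_true, y_pred, average="micro"))
print("recall    (micro):", recall_score(y_true, y_pred, average="micro"))
print("F1        (micro):", f1_score(y_true, y_pred, average="micro"))
# average_precision_score summarizes the area under the precision-recall curve.
print("AUPRC     (micro):", average_precision_score(y_true, y_score, average="micro"))
```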