Abstract

Background: With the rise of user-generated content (UGC) platforms, we are witnessing an unprecedented surge in data. Among the various content types, dance videos have emerged as a potent medium for artistic and emotional expression in the Web 2.0 era, and they have become an increasingly important means for users to captivate audiences and amplify their online influence. Consequently, predicting the popularity of dance videos on UGC platforms has drawn significant attention. Methods: This study postulates that body movement features play a pivotal role in determining the future popularity of dance videos. To test this hypothesis, we design DanceTrend, a prediction framework that integrates body movement features with color space information for dance popularity prediction. We use the jazz dance videos from the comprehensive AIST++ street dance dataset and segment each dance routine video into individual movements. We choose AlphaPose as the human pose estimation algorithm to extract human motion features from the videos, and then train movement classification models with ST-GCN (Spatial Temporal Graph Convolutional Network). These pre-trained ST-GCN models are applied to extract body movement features from our curated Bilibili dance video dataset. Alongside these body movement features, we integrate color space attributes and user metadata for the final popularity prediction task. Results: The experimental results support our initial hypothesis that body movement features significantly influence the future popularity of dance videos. A comprehensive evaluation of various feature fusion strategies and classifiers shows that a pre–post hybrid fusion strategy coupled with an XGBoost classifier yields the best results on our dataset.
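The pre–post hybrid fusion strategy reported in the Results can be sketched as follows. This is a minimal illustration only: the feature names, dimensions, and the simple averaging rule for post fusion are assumptions for exposition, not the paper's exact implementation, and the fused vector would in practice be fed to a classifier such as XGBoost.

```python
# Hedged sketch of a hybrid (pre + post) fusion scheme.
# All feature values and the averaging rule below are illustrative assumptions.

def pre_fusion(movement_feats, color_feats, meta_feats):
    """Early (pre) fusion: concatenate feature vectors into one input."""
    return movement_feats + color_feats + meta_feats

def post_fusion(score_a, score_b):
    """Late (post) fusion: combine two classifiers' popularity scores."""
    return 0.5 * (score_a + score_b)

# Hypothetical features for one video.
movement = [0.12, 0.80, 0.33]   # e.g. ST-GCN movement-class scores
color    = [0.45, 0.51]         # e.g. color-space statistics
meta     = [0.70]               # e.g. normalized user metadata

fused_input = pre_fusion(movement, color, meta)   # input to one classifier
combined_score = post_fusion(0.62, 0.58)          # merge two model outputs
```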
