Abstract

Dance performance recognition methods have been investigated and applied to tasks such as picture-pose evaluation and the synchronization of foot timing and direction. However, these methods do not yet provide detailed analysis and feedback. To do so, a performance must be understood at the component level. We therefore formulate the task as a dance-figure classification problem using three-dimensional body joints and wearable sensors. Our model is based on long short-term memory (LSTM) and includes a temporal and trajectory-wise structure that exploits the trajectory information within each timestep, together with a temporal masking module. With the proposed method we achieved 93% accuracy, substantially outperforming the baseline (84.7%) and approaching the accuracy of experienced dancers (93.6%). We have made our ballroom dance performance dataset available to researchers to further advance the field of activity recognition.
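To make the described architecture concrete, the following is a minimal sketch of an LSTM-based dance-figure classifier, not the authors' implementation. It assumes the input is a sequence of flattened 3D joint coordinates, that the temporal masking module is a learned per-timestep sigmoid gate, and that classification uses the final hidden state; the class name, joint count, and figure count are illustrative placeholders.

```python
import torch
import torch.nn as nn

class DanceFigureClassifier(nn.Module):
    """Sketch of an LSTM dance-figure classifier (assumptions noted above)."""

    def __init__(self, num_joints=25, hidden_size=128, num_figures=10):
        super().__init__()
        input_size = num_joints * 3  # x, y, z per joint at each timestep
        # Temporal masking (assumed form): a learned gate that
        # down-weights uninformative timesteps before the LSTM.
        self.mask = nn.Sequential(nn.Linear(input_size, 1), nn.Sigmoid())
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_figures)

    def forward(self, x):
        # x: (batch, timesteps, num_joints * 3)
        gate = self.mask(x)            # (batch, timesteps, 1)
        out, _ = self.lstm(x * gate)   # gated trajectory features per timestep
        return self.head(out[:, -1])   # logits over dance-figure classes


# Toy usage: a batch of 4 sequences, 60 timesteps, 25 joints.
model = DanceFigureClassifier()
logits = model(torch.randn(4, 60, 25 * 3))
print(logits.shape)  # torch.Size([4, 10])
```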
