Abstract

Choreography is usually created by professional choreographers, but advances in motion capture technology and artificial intelligence have made it possible for computers to choreograph to music. There are two main challenges in automatic choreography: 1) how to generate realistic and novel dance moves without relying on motion capture or manual production, and 2) how to choose appropriate music and motion features and matching algorithms to improve the synchronization of music and dance. To address these two challenges, we propose a framework based on the Mixture Density Network (MDN) that synthesizes dances matching the target music. The framework consists of three steps: motion generation, motion screening, and feature matching. To make the dance movements generated by the model suitable for choreography with music, we propose a parameter control algorithm and a coherence-based motion screening algorithm that improve the consistency of the dance movements. Moreover, to achieve better unity of music and motion, we propose a multi-level music and motion feature matching algorithm, which combines global feature matching with local feature matching. Finally, our framework is shown to synthesize more coherent and creative choreography with music.
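As an illustration of the motion generation step, the sketch below shows a minimal Mixture Density Network in PyTorch that maps the current pose (optionally concatenated with music features) to a Gaussian-mixture distribution over the next pose, along with the mixture negative log-likelihood loss and a temperature-controlled sampler that hints at how a parameter control step might trade diversity for smoothness. The layer sizes, input layout, and the `temperature` knob are illustrative assumptions; the abstract does not specify the paper's actual architecture or control parameters.

```python
# Minimal MDN sketch (PyTorch). All dimensions and names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MDN(nn.Module):
    def __init__(self, in_dim, out_dim, n_components, hidden=256):
        super().__init__()
        self.out_dim = out_dim
        self.n_components = n_components
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        self.pi = nn.Linear(hidden, n_components)                    # mixture weight logits
        self.mu = nn.Linear(hidden, n_components * out_dim)          # component means
        self.log_sigma = nn.Linear(hidden, n_components * out_dim)   # per-dim log std-devs

    def forward(self, x):
        h = self.backbone(x)
        log_pi = F.log_softmax(self.pi(h), dim=-1)
        mu = self.mu(h).view(-1, self.n_components, self.out_dim)
        sigma = torch.exp(self.log_sigma(h)).view(-1, self.n_components, self.out_dim)
        return log_pi, mu, sigma

def mdn_nll(log_pi, mu, sigma, target):
    """Negative log-likelihood of the observed next pose under the predicted mixture."""
    comp = torch.distributions.Normal(mu, sigma)
    log_prob = comp.log_prob(target.unsqueeze(1)).sum(dim=-1)   # (B, K): sum over pose dims
    return -torch.logsumexp(log_pi + log_prob, dim=-1).mean()

def sample_next_pose(log_pi, mu, sigma, temperature=1.0):
    """Draw one next pose; lower `temperature` concentrates sampling near likely
    component means, which can be used to favor smoother, more consistent motion."""
    k = torch.distributions.Categorical(logits=log_pi / temperature).sample()
    idx = k.view(-1, 1, 1).expand(-1, 1, mu.size(-1))
    mean = mu.gather(1, idx).squeeze(1)
    std = sigma.gather(1, idx).squeeze(1) * temperature
    return torch.normal(mean, std)
```

Rolling this sampler forward frame by frame produces a candidate motion sequence; a coherence-based screening step and the music-motion feature matching described above would then operate on such candidates.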

