Abstract

Intelligent choreography has become a popular research topic in recent years. However, existing music-based dance generation pipelines give users no way to control or intervene in the generated results. To bridge this gap, we propose CtrlDanceFr, a controllable dance generation framework that allows precise interactive adjustment of a generated dance sequence through keyframe modification while maintaining its style. CtrlDanceFr employs a novel MoDanceRev module to revise short-term clips according to user-input keyframes. Unlike traditional motion completion, which considers only motion, MoDanceRev also conditions on music and style, so that the revised clip remains rhythmic and preserves the original sequence's dance style. This approach substantially improves the customizability of long-term dance sequences produced by existing music-based dance generation methods. Furthermore, to verify that our framework maintains style, we design a dance classifier to evaluate the style of the generated sequences. Extensive qualitative and quantitative experiments demonstrate the effectiveness and robustness of CtrlDanceFr in generating authentic and smooth dance sequences.
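To make the keyframe-conditioned revision idea concrete, the following is a minimal sketch of what such a clip-revision module's interface could look like. It is not the paper's MoDanceRev implementation; the class name, feature dimensions, and transformer backbone are all assumptions chosen for illustration. The key points it illustrates are that the module takes the original clip, a keyframe mask, per-frame music features, and a style code, and that user-fixed keyframes are kept exactly while the remaining frames are regenerated.

```python
# Hypothetical sketch of a keyframe-conditioned clip reviser (not the
# authors' MoDanceRev): all names, shapes, and the backbone are assumptions.
import torch
import torch.nn as nn


class KeyframeClipReviser(nn.Module):
    """Revise a short motion clip so it passes through user-edited keyframes
    while staying conditioned on music features and a dance-style embedding."""

    def __init__(self, pose_dim=72, music_dim=35, style_dim=64, hidden=256):
        super().__init__()
        self.pose_in = nn.Linear(pose_dim + 1, hidden)   # +1 for keyframe mask bit
        self.music_in = nn.Linear(music_dim, hidden)
        self.style_in = nn.Linear(style_dim, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.pose_out = nn.Linear(hidden, pose_dim)

    def forward(self, clip, keyframe_mask, music, style):
        # clip:          (B, T, pose_dim)   original short-term clip
        # keyframe_mask: (B, T, 1)          1 where the user fixed a keyframe
        # music:         (B, T, music_dim)  per-frame audio features
        # style:         (B, style_dim)     style code of the original sequence
        x = self.pose_in(torch.cat([clip, keyframe_mask], dim=-1))
        x = x + self.music_in(music)
        x = x + self.style_in(style).unsqueeze(1)        # broadcast over time
        revised = self.pose_out(self.encoder(x))
        # Keep user keyframes exactly; regenerate the remaining frames.
        return keyframe_mask * clip + (1 - keyframe_mask) * revised
```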
