Abstract

Human motion style is a vital concept for virtual humans and has a great impact on the expressiveness of the final animation. This paper presents a novel technique that transfers style between heterogeneous motions in real time. Unlike previous approaches, our stylized motion warping can reuse style across heterogeneous motions. The key idea of our work is to represent human motions with identity-independent coordinates (IICs) and to learn relative space–time transformations between stylistically different IICs, rather than separating style from content. Once the relative space–time transformations have been estimated from a small set of stylized example motions with identical content, our technique can generate style-controllable human motions by applying these transformations to heterogeneous motions with simple linear operations. Experimental results demonstrate that our technique is efficient and powerful for stylized human motion generation. In addition, our technique can be used in numerous interactive applications, such as real-time human motion style control, stylizing motion graphs, and style-based human motion editing.
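To make the abstract's "simple linear operations" concrete, the following is a minimal sketch, not the paper's actual algorithm: it assumes motions are already expressed as per-frame IIC pose vectors, models the relative space–time transformation between a neutral and a stylized example as a single linear map plus an offset fitted by least squares, and applies it to a new motion with an interpolation weight for style control. All function names, the linear-map model, and the data layout are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: motions are (num_frames, dim) arrays of pose vectors
# already converted to identity-independent coordinates (IICs). We assume the
# relative transformation is a linear map A with offset b, which is one simple
# reading of the abstract, not a confirmed detail of the paper.

def fit_relative_transform(neutral_iic, stylized_iic):
    """Estimate (A, b) with stylized ~= neutral @ A + b via least squares.

    Both inputs come from a pair of example motions with identical content,
    one neutral and one stylized.
    """
    n, d = neutral_iic.shape
    X = np.hstack([neutral_iic, np.ones((n, 1))])   # append a bias column
    W, *_ = np.linalg.lstsq(X, stylized_iic, rcond=None)
    return W[:-1], W[-1]                            # A: (d, d), b: (d,)

def apply_style(motion_iic, A, b, weight=1.0):
    """Apply the learned transformation to a heterogeneous motion.

    `weight` linearly blends between the input motion (0.0) and the fully
    stylized result (1.0), enabling real-time style control.
    """
    stylized = motion_iic @ A + b
    return (1.0 - weight) * motion_iic + weight * stylized

# Toy usage with synthetic data (real inputs would be IIC-encoded motions):
rng = np.random.default_rng(0)
neutral = rng.standard_normal((120, 30))            # 120 frames, 30-D IICs
stylized = neutral @ (np.eye(30) * 1.1) + 0.05      # synthetic "style"
A, b = fit_relative_transform(neutral, stylized)
new_motion = rng.standard_normal((200, 30))         # heterogeneous motion
out = apply_style(new_motion, A, b, weight=0.8)
```

Because both fitting and application reduce to matrix products, this kind of formulation runs comfortably in real time, which is consistent with the interactive applications the abstract lists.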
