Abstract

Animated virtual human characters are a common feature in interactive graphical applications, such as computer and video games, online virtual worlds and simulations. Due to the dynamic nature of such applications, character animation must be responsive and controllable in addition to looking as realistic and natural as possible. Though procedural and physics‐based animation provide a great amount of control over motion, they still look too unnatural to be of use in all but a few specific scenarios, which is why interactive applications today still rely mainly on recorded and hand‐crafted motion clips. The challenge faced by animation system designers is to dynamically synthesize new, controllable motion by concatenating short motion segments into sequences of different actions or by parametrically blending clips that correspond to different variants of the same logical action. In this article, we provide an overview of research in the field of example‐based motion synthesis for interactive applications. We present methods for automated creation of supporting data structures for motion synthesis and describe how they can be employed at run‐time to generate motion that accurately accomplishes tasks specified by the AI or a human user.
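To make the parametric-blending idea concrete, the following is a minimal sketch in plain Python. The function names (`blend_poses`, `blend_clips`) and the pose representation (a flat list of joint angles per frame) are illustrative assumptions, not part of any system described in the article; real blending systems would also time-warp clips into correspondence and interpolate rotations properly (e.g. with quaternions) rather than componentwise.

```python
def blend_poses(pose_a, pose_b, w):
    # Hypothetical helper: linearly interpolate two poses, each given as
    # a list of joint-angle values, with blend weight w in [0, 1].
    return [(1.0 - w) * a + w * b for a, b in zip(pose_a, pose_b)]

def blend_clips(clip_a, clip_b, w):
    # Blend two time-aligned clips frame by frame. Assumes both clips
    # have the same length; in practice clips are first time-warped so
    # that corresponding frames represent the same phase of the motion.
    return [blend_poses(fa, fb, w) for fa, fb in zip(clip_a, clip_b)]
```

For example, blending a slow-walk clip and a fast-walk clip with a weight derived from a desired speed yields an intermediate walking motion, which is how a parameterized "walk at speed v" action can be realized from a small set of example clips.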
