Abstract
Multimedia content such as videos and games often includes animation in which virtual actors engage in conversation. Constructing such animation involves many factors, such as the content of each utterance and the state of the conversation, and therefore requires a large amount of time and labor. To address this problem, this paper presents a method that generates the head and eye movements of a virtual actor in a simple way, synchronized with the conversation. In this method, the view line angle is defined as the sum of the head and eye rotations. A shared motion mechanism dynamically adjusts the ratio of head to eye rotation according to the view line direction, generating a composite movement of the head and eyes. By letting the two modules that generate the head and eye movements share the same conversation state, head and eye movements synchronized with the conversation are produced. Finally, the proposed method is applied and an animation of a virtual actor synchronized with a conversation is demonstrated. © 2006 Wiley Periodicals, Inc. Syst Comp Jpn, 37(12): 33–44, 2006; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.20513
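The decomposition described above, where the view line angle is the sum of head and eye rotations and the split between them depends on the gaze direction, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual formula: the `eye_limit_deg` parameter and the saturation rule (eyes cover small angles alone, the head supplies the remainder for large angles) are assumptions introduced here for illustration.

```python
def split_gaze(view_angle_deg: float, eye_limit_deg: float = 30.0):
    """Split a desired view-line angle into head and eye rotations.

    The view line angle is the sum of the head and eye rotations
    (head + eye == view_angle_deg). The splitting rule used here,
    eye rotation saturating at eye_limit_deg with the head carrying
    the excess, is a hypothetical stand-in for the paper's dynamic
    ratio adjustment.
    """
    magnitude = abs(view_angle_deg)
    sign = 1.0 if view_angle_deg >= 0 else -1.0
    # Small view angles: the eyes rotate alone.
    # Larger angles: eye rotation saturates and the head turns too.
    eye = sign * min(magnitude, eye_limit_deg)
    head = view_angle_deg - eye
    return head, eye
```

For example, a 20° gaze shift would be handled by the eyes alone, while a 50° shift would combine a 30° eye rotation with a 20° head turn; in every case the two components sum back to the requested view line angle.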