Abstract

Motion capture (MoCap)-based animation is currently an active topic in computer animation research. Based on an optical MoCap system, this paper proposes a novel cross-mapping-based method for simulating facial expressions. To overcome the false correlation between the upper face and the lower jaw introduced by global facial RBF-based cross-mapping, we construct a functional-partition-based RBF cross-mapping method. During model animation, enhanced markers are added and driven by our proposed skin motion mechanism. In addition, an improved RBF-based animation approach built on the enhanced markers is proposed to produce realistic facial animation. Furthermore, a pre-computing algorithm is presented to reduce the computational cost of real-time simulation. Experiments show that the method can not only map the MoCap data of one subject onto different personalized faces but also generate realistic facial animation.
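As a rough illustration of the kind of RBF cross-mapping the abstract refers to, the sketch below fits a Gaussian-kernel RBF from a subject's neutral marker set to a target face's corresponding markers and then maps captured frames onto the target. All names, the kernel choice, and the per-partition usage are assumptions for illustration; the paper's actual formulation (kernel, partitioning scheme, solver) may differ.

```python
import numpy as np

def rbf_fit(src_neutral, dst_neutral, eps=1.0):
    """Fit Gaussian RBF weights mapping source neutral markers (N, 3)
    to the corresponding target-face markers (N, 3)."""
    d = np.linalg.norm(src_neutral[:, None, :] - src_neutral[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)                 # N x N kernel matrix
    weights = np.linalg.solve(phi, dst_neutral)   # (N, 3) interpolation weights
    return weights

def rbf_map(frame, src_neutral, weights, eps=1.0):
    """Map one captured frame of source markers (N, 3) onto the target face."""
    d = np.linalg.norm(frame[:, None, :] - src_neutral[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)
    return phi @ weights

# A functional-partition variant would fit rbf_fit separately per facial
# region (e.g., upper face vs. jaw) so jaw motion cannot leak into the
# upper-face mapping, which is the false-correlation issue noted above.
```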
