Abstract

We aim to generate the facial expressions of various persons naturally, without any markers or feature lines. We propose a new approach to estimating muscular contraction parameters for the purpose of generating facial expressions. The muscles of the face are commonly known as the muscles of facial expression; a human's facial expressions are produced by the movement of these muscles. With the muscle‐based facial model proposed by Waters, facial expressions can be generated from muscular contraction parameters. In this paper, we first detect the facial surface feature points in face images by image‐processing methods. Next, we estimate the muscular contraction parameters from the displacement between the neutral expression and an arbitrary expression, obtained by fitting the facial model wireframe to the detected feature points. Finally, the facial expression is generated by displacing the vertices of an individual facial model according to the estimated muscular contraction parameters. Experimental results show that our approach can generate various facial expressions on the individual facial model that correspond to the expressions of the actual face image. Additionally, we can generate the facial expressions of another person by applying that person's muscular contraction parameters to an individual facial model. © 2007 Wiley Periodicals, Inc. Syst Comp Jpn, 38(12): 66–75, 2007; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.20647
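To make the final step concrete, the sketch below shows how a Waters-style linear muscle displaces a single mesh vertex given a contraction parameter. This is a minimal, simplified illustration, not the paper's implementation: the cosine falloff profiles, the parameter names (`rs`, `rf`, `theta`), and the specific values used are assumptions for demonstration only.

```python
import numpy as np

def waters_linear_muscle(p, v1, v2, k, rs, rf, theta):
    """Displace vertex p under a simplified Waters-style linear muscle.

    v1: muscle attachment point (fixed end, in bone)
    v2: muscle insertion point (defines the pull direction)
    k:  muscular contraction parameter (0 = relaxed)
    rs, rf: inner and outer falloff radii of the influence zone (assumed)
    theta:  half-angle of the influence cone in radians (assumed)
    """
    d = p - v1
    dist = np.linalg.norm(d)
    axis = (v2 - v1) / np.linalg.norm(v2 - v1)
    # Vertices outside the zone of influence are left unchanged.
    if dist < 1e-9 or dist > rf:
        return p
    ang = np.arccos(np.clip(np.dot(d / dist, axis), -1.0, 1.0))
    if ang > theta:
        return p
    # Angular falloff: full pull on the muscle axis, zero at the cone edge.
    a = np.cos(ang * np.pi / (2.0 * theta))
    # Radial falloff: weaker near the attachment and near the outer radius.
    if dist <= rs:
        r = np.cos((1.0 - dist / rs) * np.pi / 2.0)
    else:
        r = np.cos((dist - rs) / (rf - rs) * np.pi / 2.0)
    # Contraction pulls the vertex toward the fixed attachment point v1.
    return p - a * r * k * (d / dist)
```

Summing such per-muscle displacements over all vertices of the individual facial model, with the estimated contraction parameter for each muscle, yields the generated expression.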
