Abstract

Real facial expressions are produced by the movement of muscles beneath the facial skin. This paper proposes a method that generates facial expressions by estimating muscle movement, i.e., muscular contraction parameters, from detected facial feature points. First, the facial feature points of a facial image are detected using image processing methods. Then, the displacements of the facial feature points, which correspond to the muscle-based facial model proposed by Waters, are calculated. Finally, we estimate the muscular contraction parameters of the facial model that yield the corresponding vertex displacements. Experimental results show that our approach generates a facial expression on the facial model that corresponds to the expression in the actual facial image.
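The abstract does not specify how the contraction parameters are estimated from the feature-point displacements. A minimal sketch of one plausible approach, assuming a linearized muscle model in which a matrix `J` maps contraction parameters to feature-point displacements (both the linearization and the non-negativity constraint are assumptions, not taken from the paper):

```python
import numpy as np

def estimate_contractions(J, d):
    """Least-squares estimate of muscular contraction parameters.

    J: (2k, m) matrix mapping m muscle contractions to the 2D
       displacements of k feature points (assumed linearization
       of a Waters-style muscle model).
    d: (2k,) observed feature-point displacements.
    Returns contraction estimates clipped to be non-negative.
    """
    c, *_ = np.linalg.lstsq(J, d, rcond=None)
    return np.clip(c, 0.0, None)  # contractions assumed non-negative

# Toy example: 3 feature points (6 displacement components), 2 muscles.
J = np.array([[1.0, 0.0],
              [0.0, 0.5],
              [0.5, 0.0],
              [0.0, 1.0],
              [0.2, 0.1],
              [0.1, 0.2]])
true_c = np.array([0.8, 0.3])   # hypothetical ground-truth contractions
d = J @ true_c                  # synthetic observed displacements
c_hat = estimate_contractions(J, d)
```

Because the toy displacements are generated from the model itself and `J` has full column rank, the least-squares solve recovers the original contraction values exactly; with noisy real measurements the estimate would only approximate them.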
