Abstract

Facial animation is a serious and ongoing challenge for the computer graphics industry. Because diverse and complex emotions must be expressed through different facial deformations and animations, copying facial deformations from an existing character to a new one is widely needed in both industry and academia, to reduce the time-consuming, repetitive manual modeling required to create 3D shape sequences for every new character. However, transferring realistic facial animations between two 3D models remains limited and inconvenient for general use: most modern deformation transfer methods require a correspondence mapping that is tedious to obtain. In this paper, we present a fast, automatic approach to transferring deformations between facial mesh models by computing the 3D point-wise correspondences automatically. The key idea is to estimate correspondences across different facial meshes with a robust facial landmark detection method, by projecting each 3D model to a 2D image. Experiments show that, without any manual labelling effort, our method detects reliable correspondences faster and more simply than the state-of-the-art automatic deformation transfer method on facial models.

Highlights

  • Character modeling, especially character facial modeling, plays a pivotal role in computer graphics and computer animation

  • Copying facial deformations from one character to another is widely needed in both industry and academia, to replace the time-consuming modeling work of creating the shape sequences for every new character

Summary

Introduction

Character modeling, especially character facial modeling, plays a pivotal role in computer graphics and computer animation. The main limitation of current deformation transfer methods is their dependence on tedious manual work to obtain a reliable correspondence map between the source character and a new target character. Research on automatically detecting correspondences between source and target characters, and on employing the skin deformation method [1] to obtain high-quality deformation transfer results fully automatically, is therefore important in computer modeling and animation: it allows existing datasets to be reused to generate new shapes and reduces the tedious time cost for artists. To tackle this problem for transferring subtle, detailed facial deformations, we propose a straightforward approach that automatically obtains 3D point-wise correspondences on source and target faces without any manual work. Experiments on facial deformation transfer show that our method fully automatically creates high-quality correspondences between source and target faces, obtains believable transferred deformations much faster, and is simpler than the state-of-the-art automatic deformation transfer method presented in [2].
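The pipeline described above (project the 3D mesh to a 2D image, detect 2D facial landmarks, then map them back to 3D mesh vertices) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a simple orthographic front-view projection and stubs out the landmark detector with fixed 2D points (the paper's actual detector and projection are not specified in this excerpt); the back-projection step is a nearest-projected-vertex lookup.

```python
import numpy as np

def project_to_2d(vertices):
    """Orthographic front-view projection: drop the depth (z) axis.

    The paper projects the 3D face model to a 2D image; a full camera
    model could be substituted here."""
    return vertices[:, :2]

def landmarks_to_vertex_ids(vertices, landmarks_2d):
    """Map each detected 2D landmark back to the index of the nearest
    projected mesh vertex, giving 3D point-wise correspondences."""
    proj = project_to_2d(vertices)
    ids = []
    for lm in landmarks_2d:
        dists = np.linalg.norm(proj - lm, axis=1)
        ids.append(int(np.argmin(dists)))
    return ids

# Toy mesh: four vertices at different depths (stand-in for a face mesh).
verts = np.array([[0.0, 0.0, 0.1],
                  [1.0, 0.0, 0.2],
                  [0.0, 1.0, 0.3],
                  [1.0, 1.0, 0.4]])
# Stub: pretend a 2D landmark detector returned these image-plane points.
lms = np.array([[0.9, 0.95], [0.05, 0.1]])
print(landmarks_to_vertex_ids(verts, lms))  # -> [3, 0]
```

Running this on both the source and target meshes yields matching landmark-anchored vertex indices on each, which is the correspondence map that the deformation transfer step consumes.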

Related Work
Our Approach
Convert 3D Mesh to 2D Space
Detect and Refine the 2D Facial Landmarks
Retrieve 3D Point-Wise Correspondences
Deformation Transfer
Experiments and Comparisons
Conclusions