Abstract

This paper presents a novel data-driven method for creating varied, realistic face models by synthesizing a set of facial features according to intuitive high-level control parameters. Our method takes 3D face scans as examples in order to exploit the variation present in the real faces of individuals. We use an automatic model-fitting approach to solve the 3D registration problem. Once every example shares a common surface representation, we form feature shape spaces by applying principal component analysis (PCA) to the data sets of facial feature shapes. Using PCA coefficients as a compact shape representation, we approach the shape synthesis problem by forming scattered-data interpolation functions that generate the desired shape from anthropometric parameters given as input. Correspondence among all example textures is obtained by parameterizing a generic 3D mesh over a 2D image domain. A new feature texture with the desired attributes is synthesized by interpolating the example textures. Apart from an initial tuning of feature point positions and assignment of texture attribute values, our method is fully automated.
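The pipeline described above (PCA over registered feature shapes, then scattered-data interpolation from anthropometric parameters to PCA coefficients) can be sketched roughly as follows. This is a minimal illustration with synthetic placeholder data, not the paper's implementation: the dimensions, the Gaussian radial basis function, and the regularization constant are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Example feature shapes in a common surface representation:
#    each row is one example's flattened 3D vertex coordinates.
#    (Synthetic stand-in data; real inputs would come from registered scans.)
n_examples, n_coords = 20, 3 * 50
shapes = rng.normal(size=(n_examples, n_coords))

# 2) PCA shape space: center the data and keep the top-k components.
mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
k = 5
components = Vt[:k]                # k x n_coords principal basis
coeffs = centered @ components.T   # n_examples x k PCA coefficients

# 3) Anthropometric parameters for each example (placeholder values,
#    e.g. a feature's length/width/depth measurements).
params = rng.uniform(size=(n_examples, 3))

def rbf_matrix(a, b, sigma=0.5):
    """Gaussian radial basis kernel between two parameter sets."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

# Solve for interpolation weights mapping parameters -> PCA coefficients.
# A tiny ridge term keeps the system well conditioned.
Phi = rbf_matrix(params, params)
weights = np.linalg.solve(Phi + 1e-8 * np.eye(n_examples), coeffs)

def synthesize(query_params):
    """Interpolate PCA coefficients at new parameter values,
    then reconstruct the corresponding feature shape."""
    phi = rbf_matrix(np.atleast_2d(query_params), params)
    new_coeffs = phi @ weights
    return mean_shape + new_coeffs @ components

new_shape = synthesize(params[0])
```

Because the interpolant passes (up to the ridge term) through the training data, querying an example's own parameters reproduces its PCA-truncated shape; novel parameter values yield blends of the example shapes.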
