Abstract

The traditional garment animation workflow relies on professional clothing simulators and requires manual editing by artists or animators, a process that is undeniably time-consuming and laborious. Synthesizing garment dynamics semi-automatically from high-level input parameters not only helps bridge the gap between creative inspiration and technical implementation, but also lets artists focus on authoring the animated content. To that end, a variational auto-encoder-based garment animation synthesis method is presented. First, a set of motion sequences covering different poses is sampled to generate a human body dataset. Second, a variational auto-encoder network is constructed to learn the probabilistic distribution of clothing deformation from garment motions under different pose variations. In addition, a mesh Laplacian term is introduced into the loss function to preserve the wrinkle details of the synthesized garments. Constraints are then imposed on the latent space to control the garment shape to be generated. Finally, a refinement process resolves penetrations between the body surface and the garment mesh, yielding realistic clothing deformations. The proposed method is evaluated qualitatively and quantitatively on the AMASS dataset from two aspects: body motion/shape-driven garment synthesis and garment animation authoring. The experimental results demonstrate that the proposed workflow produces visually realistic garments without noticeable artifacts and generates temporally consistent garment dynamics under shape and pose variations, assisting artists in authoring the desired clothing deformations.
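The following is a minimal sketch of the kind of training objective the abstract describes: a variational auto-encoder over garment vertices conditioned on body pose, with a mesh Laplacian term added to the loss to preserve wrinkle detail. It is written in PyTorch-style Python purely for illustration; the network architecture, dimensions, weighting factors, and all names here are assumptions, not the authors' actual implementation.

```python
# Illustrative sketch only (assumed PyTorch-style code, not the paper's implementation).
# A VAE encodes the garment vertices together with the body pose into a latent code,
# and the decoder reconstructs the garment from the latent code and the pose.
# A mesh Laplacian term compares differential coordinates of the predicted and
# ground-truth garments, which encourages the network to keep wrinkle detail.

import torch
import torch.nn as nn

class GarmentVAE(nn.Module):
    def __init__(self, n_verts, pose_dim, latent_dim=64, hidden=512):
        super().__init__()
        in_dim = n_verts * 3 + pose_dim
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU())
        self.to_mu = nn.Linear(hidden, latent_dim)
        self.to_logvar = nn.Linear(hidden, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + pose_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_verts * 3))

    def forward(self, verts, pose):
        # verts: (B, n_verts, 3) garment vertices; pose: (B, pose_dim) body pose parameters
        h = self.encoder(torch.cat([verts.flatten(1), pose], dim=1))
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        recon = self.decoder(torch.cat([z, pose], dim=1)).view_as(verts)
        return recon, mu, logvar

def vae_loss(recon, verts, mu, logvar, L, w_kl=1e-3, w_lap=1.0):
    # L: sparse (n_verts, n_verts) graph Laplacian of the garment template mesh
    rec = ((recon - verts) ** 2).sum(dim=(1, 2)).mean()
    kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(dim=1).mean()
    # Laplacian (differential) coordinates of predicted vs. ground-truth meshes
    delta_pred = torch.stack([torch.sparse.mm(L, v) for v in recon])
    delta_gt = torch.stack([torch.sparse.mm(L, v) for v in verts])
    lap = ((delta_pred - delta_gt) ** 2).sum(dim=(1, 2)).mean()
    return rec + w_kl * kl + w_lap * lap
```

At inference time, the same decoder would be driven by a latent code and the target pose; constraining or interpolating that latent code is one plausible way to realize the latent-space control of garment shape mentioned above, with a post-hoc penetration-resolution step applied to the decoded mesh.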
