Abstract

This paper presents a learning‐based clothing animation method for highly efficient virtual try‐on simulation. Given a garment, we preprocess a rich database of physically‐based dressed character simulations, for multiple body shapes and animations. Then, using this database, we train a learning‐based model of cloth drape and wrinkles, as a function of body shape and dynamics. We propose a model that separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. We use a recurrent neural network to regress garment wrinkles, and we achieve highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods. At runtime, dynamic virtual try‐on animations are produced in just a few milliseconds for garments with thousands of triangles. We show qualitative and quantitative analysis of results.
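To make the two-branch structure described above concrete, here is a minimal PyTorch sketch of a model that separates a static global-fit term (regressed from body shape) from a dynamic wrinkle term (regressed from a pose/shape sequence by a recurrent network). All names, dimensions (`n_verts`, `shape_dim`, `pose_dim`, `hidden`), and layer choices are illustrative assumptions, not the paper's actual architecture; the paper's skinning and data pipeline are omitted.

```python
import torch
import torch.nn as nn

class GarmentFitAndWrinkles(nn.Module):
    """Illustrative two-branch regressor: static global fit from body
    shape, dynamic local wrinkles from a pose/shape sequence (GRU)."""

    def __init__(self, n_verts=4000, shape_dim=10, pose_dim=72, hidden=256):
        super().__init__()
        # Global fit branch: body shape -> per-vertex displacement (static).
        self.fit_mlp = nn.Sequential(
            nn.Linear(shape_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_verts * 3),
        )
        # Wrinkle branch: recurrent, so output depends on pose *dynamics*,
        # not just the current frame.
        self.rnn = nn.GRU(pose_dim + shape_dim, hidden, batch_first=True)
        self.wrinkle_out = nn.Linear(hidden, n_verts * 3)
        self.n_verts = n_verts

    def forward(self, shape, pose_seq, template):
        # shape: (B, shape_dim); pose_seq: (B, T, pose_dim)
        # template: (n_verts, 3) rest-pose garment mesh
        B, T, _ = pose_seq.shape
        fit = self.fit_mlp(shape).view(B, 1, self.n_verts, 3)
        rnn_in = torch.cat(
            [pose_seq, shape.unsqueeze(1).expand(B, T, -1)], dim=-1)
        h, _ = self.rnn(rnn_in)
        wrinkles = self.wrinkle_out(h).view(B, T, self.n_verts, 3)
        # Deformed garment: template plus global fit plus dynamic wrinkles
        # (skinning onto the posed body would follow in a full pipeline).
        return template + fit + wrinkles

# Usage: one character, a 30-frame motion clip.
model = GarmentFitAndWrinkles()
template = torch.zeros(4000, 3)
verts = model(torch.randn(1, 10), torch.randn(1, 30, 72), template)
print(verts.shape)  # torch.Size([1, 30, 4000, 3])
```

A model of this form would be trained against the preprocessed physically‐based simulation database, and at runtime a single recurrent step per frame keeps inference within the millisecond budget the abstract reports.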
