Abstract
Virtual garment animation and deformation constitute a pivotal research direction in computer graphics, with extensive applications in domains such as computer games, animation, and film. Traditional physics-based methods can simulate the physical characteristics of garments, such as elasticity and gravity, to generate realistic deformation effects; however, their computational complexity hinders real-time animation generation. Data-driven approaches, in contrast, learn from existing garment deformation data and enable rapid animation generation, but the resulting animations often lack realism and struggle to capture subtle variations in garment behavior. We propose an approach that balances realism and speed: by considering both the spatial and temporal dimensions, we leverage real-world videos to capture human motion and garment deformation, thereby producing more realistic animation effects. We address the complexity of spatiotemporal attention by aligning input features and computing spatiotemporal attention at each spatial position in a batch-wise manner. For garment deformation, garment segmentation techniques are employed to extract garment templates from videos. Subsequently, our Transformer-based temporal framework captures the correlation between garment deformation and human body shape features, as well as frame-level dependencies. Furthermore, a feature fusion strategy merges shape and motion features, and post-processing resolves penetration between clothing and the human body, yielding collision-free garment deformation sequences. Qualitative and quantitative experiments demonstrate the superiority of our approach over existing methods, efficiently producing temporally coherent and realistic dynamic garment deformations.
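The abstract mentions reducing the cost of spatiotemporal attention by computing attention at each spatial position in a batch-wise manner. The paper's actual implementation is not given here; the following is a minimal illustrative sketch (in NumPy, with assumed tensor shapes) of one common way to do this: fold the spatial positions into the batch axis so that scaled dot-product self-attention runs over the time axis independently at every position, avoiding a full (frames × positions)-squared attention matrix.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_attention_per_position(x):
    """x: (batch, frames, positions, dim) -> array of the same shape.

    Illustrative sketch: spatial positions are folded into the batch
    axis, so self-attention is computed over frames at each position
    independently (batch-wise), not over all space-time pairs jointly.
    """
    b, t, p, c = x.shape
    # (batch, positions, frames, dim) -> (batch * positions, frames, dim)
    q = x.transpose(0, 2, 1, 3).reshape(b * p, t, c)
    # Scaled dot-product attention over the time axis only.
    scores = q @ q.transpose(0, 2, 1) / np.sqrt(c)   # (b*p, t, t)
    out = softmax(scores, axis=-1) @ q               # (b*p, t, c)
    return out.reshape(b, p, t, c).transpose(0, 2, 1, 3)

feats = np.random.randn(2, 8, 16, 32)  # (batch, frames, positions, dim)
out = temporal_attention_per_position(feats)
print(out.shape)  # (2, 8, 16, 32)
```

With T frames and P spatial positions, this reduces the attention matrix from (T·P)² entries to P separate T² matrices, which is what makes the batch-wise formulation tractable for video-length sequences.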