Abstract
Human parametric models provide useful constraints for human shape estimation and lead to more accurate results. However, state-of-the-art models are computationally expensive, which limits their use in interactive graphics applications. We present PROME (PROjected MEasures), a novel human parametric model with high expressive power and low computational complexity. Projected measures are sets of 2D contour polylines that capture key measurement features defined in anthropometry. The PROME model relates 3D shape and pose parameters to 2D projected measures. We train the model in two parts: a shape model that formulates deformations of the projected measures caused by shape variation, and a pose model that formulates deformations caused by pose variation. Based on the PROME model, we further propose a fast shape estimation method that estimates the 3D shape parameters of a subject from a single image in near real-time. The method formulates an optimization problem and solves it with a gradient-based optimization strategy. Experimental results show that the PROME model represents human bodies across different shapes and poses as well as existing 3D human parametric models such as SCAPE [Anguelov et al. 2005] and TenBo [Chen et al. 2013], while having much lower computational complexity. Our shape estimation method processes an image in about one second, orders of magnitude faster than state-of-the-art methods, and its results are very close to the ground truth. The proposed method can be widely used in interactive applications such as virtual try-on and virtual reality collaboration.
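To make the shape estimation step concrete, the sketch below illustrates the general idea of fitting shape parameters by gradient-based optimization so that model-predicted projected measures match measures observed in an image. This is a minimal illustration under stated assumptions, not the authors' implementation: the linear measure model and all names (project_measures, observed_measures, etc.) are hypothetical stand-ins for the trained PROME shape model and the contour measures extracted from a real photograph.

```python
# Minimal sketch: estimate shape parameters beta by minimizing the discrepancy
# between predicted and observed 2D projected measures with a gradient-based solver.
# The linear "measure model" is a toy stand-in for the trained PROME shape model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

NUM_SHAPE_PARAMS = 10   # dimensionality of the shape parameter vector beta (assumed)
NUM_MEASURES = 40       # number of sampled values on the projected measure contours (assumed)

# Toy linear mapping from shape parameters to projected-measure values
# (the real model is learned from registered body scans).
A = rng.normal(size=(NUM_MEASURES, NUM_SHAPE_PARAMS))
b = rng.normal(size=NUM_MEASURES)

def project_measures(beta):
    """Predict projected-measure values for shape parameters beta."""
    return A @ beta + b

# Simulated observation: measures that would be extracted from a subject's image contour.
true_beta = rng.normal(size=NUM_SHAPE_PARAMS)
observed_measures = project_measures(true_beta) + 0.01 * rng.normal(size=NUM_MEASURES)

def objective(beta):
    """Sum-of-squares discrepancy between predicted and observed measures."""
    residual = project_measures(beta) - observed_measures
    return 0.5 * residual @ residual

def gradient(beta):
    """Analytic gradient of the objective for the linear toy model."""
    return A.T @ (project_measures(beta) - observed_measures)

result = minimize(objective, x0=np.zeros(NUM_SHAPE_PARAMS), jac=gradient, method="L-BFGS-B")
print("estimation error:", np.linalg.norm(result.x - true_beta))
```

Because the objective involves only low-dimensional 2D measures rather than a full 3D mesh, each gradient evaluation is cheap, which is what makes near real-time fitting plausible in this setting.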