Abstract

Garment transfer aims to dress a person image in the garment worn in a model image. Because it leverages in-the-wild, inexpensive garment inputs, garment transfer has attracted tremendous attention in the community and holds great commercial potential. Since ground truth for garment transfer is almost never available in practice, previous studies have framed it as either pose transfer or garment-pose disentanglement and trained it in a self-supervised manner. However, these formulations do not fully capture the intent of garment transfer and face robustness issues at test time. Noting that virtual try-on has exhibited superior performance under self-supervised learning, we propose to supervise garment transfer training via knowledge distillation from virtual try-on. Specifically, our pipeline first infers a garment transfer parsing and then uses it to guide the downstream warping and inpainting tasks. The transfer parsing reasoning model learns response and feature knowledge from the try-on parsing reasoning model and absorbs hard knowledge from the ground truth. The progressive flow warping model learns content knowledge from virtual try-on for reasonable and precise garment warping. To enhance transfer realism, we further propose an arm regrowth task that infers the exposed skin. Experiments demonstrate that our method achieves state-of-the-art performance in transferring garments between persons compared with other virtual try-on and garment transfer methods.
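
The abstract does not give the exact distillation losses; as an illustrative sketch only, the snippet below combines the three supervision signals named above for the parsing model: response knowledge as Hinton-style softened-logit matching, feature knowledge as intermediate-feature matching, and hard knowledge as cross-entropy against the ground-truth parsing. The function name, loss weights, and temperature T are assumptions for illustration, not the paper's implementation.

# Minimal sketch (not the paper's code) of the three supervision signals
# for the transfer parsing model. All names, weights, and the temperature
# are illustrative assumptions.
import torch
import torch.nn.functional as F

def parsing_distillation_loss(student_logits, teacher_logits,
                              student_feats, teacher_feats,
                              gt_parsing, T=2.0,
                              w_resp=1.0, w_feat=1.0, w_hard=1.0):
    """student_logits/teacher_logits: (B, C, H, W) segmentation logits.
    student_feats/teacher_feats: lists of same-shaped intermediate features.
    gt_parsing: (B, H, W) integer class labels."""
    # Response knowledge: KL divergence between temperature-softened
    # class distributions of student and teacher.
    resp = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # Feature knowledge: match intermediate activations of the two parsers.
    feat = sum(F.mse_loss(s, t.detach())
               for s, t in zip(student_feats, teacher_feats))

    # Hard knowledge: cross-entropy against the ground-truth parsing labels.
    hard = F.cross_entropy(student_logits, gt_parsing)

    return w_resp * resp + w_feat * feat + w_hard * hard

Detaching the teacher features keeps gradients from flowing into the frozen try-on teacher, the usual design choice in distillation setups.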
