Abstract

Artistic portrait drawing (APDrawing) generation has made progress in recent years. However, because such drawings are inherently scarce and artistic, it is difficult to collect large-scale labeled and paired data or to divide drawing styles into a few well-recognized categories. Existing works suffer from limited labeled data and a naive manual division of drawing styles according to the corresponding artists; they cannot handle realistic situations in which, for example, a single artist uses multiple drawing styles or APDrawings from different artists share similar styles. In this paper, we propose to use unlabeled and unpaired data and to perform the task in an unsupervised manner. Instead of manually dividing drawing styles, we treat each portrait drawing as a unique style and introduce self-supervised feature learning to learn free styles from unlabeled portrait drawings. In addition, we devise a style bank and a decoupled cycle structure to address the two main considerations of the task: generation quality and style control. Extensive experiments show that our model adapts to diverse style inputs better than state-of-the-art methods.
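To make the "style bank" idea concrete, the following is a minimal, hypothetical PyTorch sketch of the general concept: a learnable set of style prototype vectors, with a small encoder predicting soft combination weights for each drawing so that every unlabeled drawing receives its own style code. The bank size, style dimension, and encoder architecture are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StyleBank(nn.Module):
    """Learnable bank of style vectors; each drawing gets a soft combination.

    Hypothetical sketch: num_styles, style_dim, and the tiny conv encoder
    below are illustrative choices, not the paper's settings.
    """

    def __init__(self, num_styles: int = 64, style_dim: int = 128):
        super().__init__()
        # K learnable style prototypes (the "bank").
        self.bank = nn.Parameter(torch.randn(num_styles, style_dim) * 0.02)
        # Small encoder mapping a grayscale drawing to weights over the bank.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_styles),
        )

    def forward(self, drawing: torch.Tensor) -> torch.Tensor:
        # Soft attention over the bank -> one style code per drawing.
        weights = F.softmax(self.encoder(drawing), dim=1)   # (B, K)
        return weights @ self.bank                          # (B, style_dim)


if __name__ == "__main__":
    bank = StyleBank()
    fake_drawings = torch.randn(4, 1, 256, 256)  # batch of unlabeled drawings
    style_codes = bank(fake_drawings)
    print(style_codes.shape)  # torch.Size([4, 128])
```

In such a setup, the per-drawing style code would be fed to the generator to control style, while the self-supervised objective shapes the bank without any manual style labels.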
