Abstract

Arbitrary neural style transfer aims to render a content image in any given artistic style using features extracted from a well-trained convolutional neural network. Existing style transfer algorithms have demonstrated astonishing results, but the generated images suffer from loss of content details, non-uniform stroke patterns, and limited diversity. In this article, we focus on improving the diversity of the stylized images. We propose a lightweight yet efficient method named style permutation (SP) that tackles the limited diversity without harming the original style information. The core of our style permutation algorithm is to multiply the deep image feature maps by a permutation matrix. Compared with state-of-the-art diversified style transfer methods, our style permutation algorithm offers more flexibility. We also present qualitative and quantitative analyses and a theoretical explanation of the effectiveness of the proposed method. Experimental results show that the proposed method generates diverse outputs for arbitrary styles when integrated into both WCT (whitening and coloring transform)-based and AdaIN (adaptive instance normalization)-based methods.
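The core operation described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation: it flattens a (C, H, W) feature map to a (C, H·W) matrix and left-multiplies it by a random C×C permutation matrix, so different random seeds yield different channel orderings and hence diverse stylizations. The function name and seeding scheme are hypothetical.

```python
import numpy as np

def style_permutation(features: np.ndarray, seed: int = 0) -> np.ndarray:
    """Permute the channel dimension of a (C, H, W) feature map.

    Hypothetical sketch: equivalent to left-multiplying the flattened
    (C, H*W) feature matrix by a random C x C permutation matrix P.
    """
    c, h, w = features.shape
    rng = np.random.default_rng(seed)
    p = np.eye(c)[rng.permutation(c)]   # C x C permutation matrix
    flat = features.reshape(c, h * w)   # flatten spatial dimensions
    return (p @ flat).reshape(c, h, w)  # apply P, restore shape

# A permutation only reorders channels: sorting along the channel
# axis recovers identical values, and the shape is unchanged.
feats = np.arange(60, dtype=float).reshape(5, 3, 4)
out = style_permutation(feats, seed=1)
assert out.shape == feats.shape
assert np.allclose(np.sort(out, axis=0), np.sort(feats, axis=0))
```

Because a permutation matrix is orthogonal, this reordering preserves the channel-wise statistics that WCT- and AdaIN-style transforms match, which is consistent with the claim that diversity is gained without harming the style information.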
