Abstract

Image style transfer extracts the style from a style image and applies it to a content image. Since the introduction of neural style transfer, the field has developed rapidly and many new methods have been proposed. Several methods are based on feed-forward networks, but they can usually transfer only one style or a small set of styles, and they typically compute the style loss from the Gram matrix of features. Because the Gram matrix captures only global statistics, local details are often stylized poorly, which leads to distortions and artifacts. In this work, we propose an arbitrary style transfer method based on a feed-forward network, in which the Gram matrix and feature similarity are used together to compute the style loss. Minimizing the loss computed from the similarity between features (called the contextual loss in this paper) produces stylized images with better details and fewer artifacts. Experimental results and a user study show that our method achieves state-of-the-art performance compared with existing arbitrary style transfer methods.
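To make the combined style objective concrete, below is a minimal PyTorch sketch of a Gram-matrix style loss alongside a similarity-based (contextual-style) loss, assuming feature maps from a pretrained encoder such as VGG. The function names (gram_loss, contextual_loss, style_loss), the bandwidth h, and the weight lambda_cx are illustrative assumptions; the contextual term follows the general formulation of Mechrez et al. and is not the paper's exact implementation.

import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (B, C, H, W) feature map from a pretrained encoder (e.g. VGG)
    b, c, h, w = feat.size()
    f = feat.view(b, c, h * w)
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)

def gram_loss(stylized_feat, style_feat):
    # global style statistics matched via Gram matrices
    return F.mse_loss(gram_matrix(stylized_feat), gram_matrix(style_feat))

def contextual_loss(stylized_feat, style_feat, h=0.5, eps=1e-5):
    # similarity-based loss in the spirit of the contextual loss named in the
    # abstract; this specific formulation is an assumption, not the paper's code
    b, c, _, _ = stylized_feat.size()
    x = stylized_feat.view(b, c, -1)            # (B, C, N)
    y = style_feat.view(b, c, -1)               # (B, C, M)
    y_mu = y.mean(dim=2, keepdim=True)
    x_c = F.normalize(x - y_mu, dim=1)          # center by style mean, unit-normalize
    y_c = F.normalize(y - y_mu, dim=1)
    cos = torch.bmm(x_c.transpose(1, 2), y_c)   # (B, N, M) cosine similarities
    dist = 1.0 - cos                            # cosine distances
    d_min = dist.min(dim=2, keepdim=True)[0]
    d_norm = dist / (d_min + eps)               # normalize per stylized feature
    w = torch.exp((1.0 - d_norm) / h)           # convert distances to affinities
    cx = w / (w.sum(dim=2, keepdim=True) + eps) # row-normalized affinities
    # encourage every stylized feature to have at least one close style feature
    return -torch.log(cx.max(dim=2)[0].mean(dim=1) + eps).mean()

def style_loss(stylized_feat, style_feat, lambda_cx=1.0):
    # combined objective: global Gram statistics plus local feature similarity
    return gram_loss(stylized_feat, style_feat) + lambda_cx * contextual_loss(stylized_feat, style_feat)

In a feed-forward setup, these losses would be evaluated on encoder features of the stylized output and the style image at one or more layers and summed with a content loss; the layer choice and weighting are design choices left open here.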
