Abstract

Arbitrary style transfer generates a stylized image from any input pair of a content image and a style image. Recent arbitrary style transfer algorithms tend to distort the content or transfer the style incompletely because the network must trade off content structure against style. In this paper, we introduce a dual attention network based on style attention and channel attention, which transfers local styles flexibly, attends more closely to content structure, keeps the content structure intact, and suppresses unnecessary style transfer. Experimental results show that the network synthesizes high-quality stylized images while maintaining real-time performance.
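
The abstract does not spell out the module design, so the following is only a rough illustration of how a dual attention block of this kind is often built: a SANet-style style attention (content features as query, style features as key/value) followed by a squeeze-and-excitation-style channel attention that re-weights channels of the stylized features. All class names, layer choices, and shapes below are hypothetical, not the paper's actual architecture.

```python
import torch
import torch.nn as nn


def mean_variance_norm(feat, eps=1e-5):
    # Normalize each channel of a feature map to zero mean / unit variance.
    mean = feat.mean(dim=(2, 3), keepdim=True)
    std = feat.std(dim=(2, 3), keepdim=True) + eps
    return (feat - mean) / std


class StyleAttention(nn.Module):
    # Style attention in the spirit of SANet: content features act as the
    # query and style features as key/value, so each content position
    # pulls in the local style pattern that best matches it.
    def __init__(self, channels):
        super().__init__()
        self.f = nn.Conv2d(channels, channels, 1)    # query (content)
        self.g = nn.Conv2d(channels, channels, 1)    # key (style)
        self.h = nn.Conv2d(channels, channels, 1)    # value (style)
        self.out = nn.Conv2d(channels, channels, 1)

    def forward(self, content, style):
        b, c, hc, wc = content.shape
        q = self.f(mean_variance_norm(content)).flatten(2)   # B x C x Nc
        k = self.g(mean_variance_norm(style)).flatten(2)     # B x C x Ns
        v = self.h(style).flatten(2)                         # B x C x Ns
        attn = torch.softmax(q.transpose(1, 2) @ k, dim=-1)  # B x Nc x Ns
        o = (v @ attn.transpose(1, 2)).view(b, c, hc, wc)
        return content + self.out(o)  # residual keeps content structure


class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style gate: re-weights channels of the
    # stylized features, damping channels that carry unwanted style.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))           # B x C channel descriptor
        return x * w.unsqueeze(-1).unsqueeze(-1)  # per-channel re-weighting


class DualAttention(nn.Module):
    # Dual attention block: style attention followed by channel attention.
    def __init__(self, channels=512):
        super().__init__()
        self.style_attn = StyleAttention(channels)
        self.channel_attn = ChannelAttention(channels)

    def forward(self, content_feat, style_feat):
        return self.channel_attn(self.style_attn(content_feat, style_feat))


# Example on VGG-like relu4_1 features (shapes are illustrative):
block = DualAttention(channels=512)
content_feat = torch.randn(1, 512, 32, 32)
style_feat = torch.randn(1, 512, 40, 40)
stylized = block(content_feat, style_feat)  # -> 1 x 512 x 32 x 32
```

The residual connection in the style attention and the sigmoid channel gate are one plausible way to realize the stated goals of keeping content structure intact and reducing unnecessary style transfer; the paper itself may combine the two attentions differently.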
