Abstract

Existing image translation methods already enable style transfer on unpaired data. Although these methods yield satisfactory results, they still alter the background while changing the object. One reason is that, in convolutional neural networks, global information is lost as the number of layers increases, and the lack of an effective receptive field prevents the generation of high-quality results. This paper proposes a Non-Local-Attention Cycle-Consistent Adversarial Network for unpaired image style transfer. Non-local attention quickly captures long-range dependencies, extracts global information more effectively, keeps the focus on the foreground while preserving the background, and can be easily embedded into existing network architectures. Experiments on a neural style transfer task with a public dataset show that this model obtains better results than CycleGAN: it attends to structural features rather than only textural features, and it can reconstruct some of the content lost by CycleGAN. Recent research has also shown that the choice of optimizer affects network performance; this paper applies the Nadam optimizer and finds that it improves the training process.
