Abstract

Image style transfer is a popular topic in computer vision. This paper proposes a reconstructed generator model based on Cycle Generative Adversarial Networks (CycleGAN) for image style transfer tasks. First, deep feature interpolation is applied to extract high-level semantic representations of images in deep feature space; this is then combined with CycleGAN to construct a new deep network model. In this model, the generator network is composed of sampling layers and residual modules. Following this process, a new deep feature generator is designed to realize cross-region style transfer. Experiments are carried out on several standard datasets: horse2zebra, monet2photo, and summer2winter. The model is evaluated with the objective metrics PSNR and SSIM. Experimental results show improved performance compared with the classical CycleGAN.
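
As a point of reference for the generator layout described above (downsampling layers, residual modules, then upsampling layers), the following is a minimal PyTorch sketch of a standard CycleGAN-style generator. The layer counts, channel widths, and the Generator/ResidualBlock names are illustrative assumptions; the abstract does not specify the exact architecture, and the deep feature interpolation step is not shown here.

# Minimal sketch of a CycleGAN-style generator: downsampling layers,
# residual modules, and upsampling layers. Layer counts and channel
# sizes are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Residual module: two 3x3 convolutions with a skip connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x):
        return x + self.block(x)


class Generator(nn.Module):
    """Encoder (downsampling) -> residual modules -> decoder (upsampling)."""

    def __init__(self, in_channels: int = 3, base: int = 64, n_residual: int = 9):
        super().__init__()
        layers = [
            nn.ReflectionPad2d(3),
            nn.Conv2d(in_channels, base, kernel_size=7),
            nn.InstanceNorm2d(base),
            nn.ReLU(inplace=True),
        ]
        # Downsampling layers: map the image into a lower-resolution deep feature space.
        channels = base
        for _ in range(2):
            layers += [
                nn.Conv2d(channels, channels * 2, kernel_size=3, stride=2, padding=1),
                nn.InstanceNorm2d(channels * 2),
                nn.ReLU(inplace=True),
            ]
            channels *= 2
        # Residual modules operating on the deep features.
        layers += [ResidualBlock(channels) for _ in range(n_residual)]
        # Upsampling layers: decode the features back to image resolution.
        for _ in range(2):
            layers += [
                nn.ConvTranspose2d(channels, channels // 2, kernel_size=3,
                                   stride=2, padding=1, output_padding=1),
                nn.InstanceNorm2d(channels // 2),
                nn.ReLU(inplace=True),
            ]
            channels //= 2
        layers += [
            nn.ReflectionPad2d(3),
            nn.Conv2d(channels, in_channels, kernel_size=7),
            nn.Tanh(),
        ]
        self.model = nn.Sequential(*layers)

    def forward(self, x):
        return self.model(x)


if __name__ == "__main__":
    g = Generator()
    out = g(torch.randn(1, 3, 256, 256))
    print(out.shape)  # torch.Size([1, 3, 256, 256])

A generator of this shape keeps the spatial resolution of the output equal to the input, which is why the translated image can be compared against a reference with pixel-level metrics such as PSNR and SSIM.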
