Abstract

To address the problem of texture preservation in image style transfer, this paper presents a novel style transfer method for images that contain tiny details easily noticed by human observers (e.g., human faces). We aim to achieve content-preserving style transfer via an appropriate trade-off between detail preservation and stylization. To this end, we combine the wavelet transform with a deep neural network to synthesize style and detail in a decoupled manner. In addition, style transfer should establish a one-to-one correspondence between the semantic structures of the two scenes and avoid noticeably unnatural style transitions around them. To address this, we leverage an attention mechanism and semantic segmentation for matching, and we design a novel content loss with local one-to-one correspondence to produce content-preserving stylized results. Finally, we apply the wavelet transform in a feature optimization (FO) step to repair imperfect results. We validate the proposed method through extensive experiments with the Qabf metric and a user study, showing its superiority over state-of-the-art methods in overall quality and texture preservation.
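To make the decoupling idea concrete, the sketch below shows a one-level 2D Haar wavelet transform, the simplest instance of the kind of decomposition the abstract refers to: the low-frequency (LL) subband carries coarse structure that stylization can alter, while the high-frequency (LH, HL, HH) subbands carry the fine details to be preserved. This is a minimal illustration in NumPy, not the paper's actual implementation; the function names and normalization are our own assumptions.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar transform (illustrative, not the paper's code).

    Splits an even-sized grayscale image into an approximation subband LL
    (coarse content, the part a style transfer network would modify) and
    detail subbands LH, HL, HH (fine textures to preserve).
    """
    # Row-wise averages and differences.
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    # Column-wise averages and differences of each half.
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Inverse transform: exactly reconstructs the input of haar_dwt2."""
    h, w = LL.shape
    a = np.empty((h, 2 * w))
    d = np.empty((h, 2 * w))
    a[:, 0::2] = LL + LH
    a[:, 1::2] = LL - LH
    d[:, 0::2] = HL + HH
    d[:, 1::2] = HL - HH
    img = np.empty((2 * h, 2 * w))
    img[0::2, :] = a + d
    img[1::2, :] = a - d
    return img
```

Because the transform is exactly invertible, a stylized LL subband can be recombined with the original detail subbands via `haar_idwt2` to keep fine textures intact while still changing the overall style.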
