Abstract

State-of-the-art image style transfer methods have achieved impressive results using neural networks. However, neural style transfer (NST) methods either ignore the local details of the style image by modeling style with global statistics or fail to fully exploit the shallow features of neural networks, so the synthesized images lack detail. In this study, we propose a new patch-based style transfer method that operates directly in the image pixel domain without any neural network, producing compelling style transfer results with rich image details. The proposed method is derived from classic texture synthesis methods. Most previous methods rely on nearest neighbor search (NNS) for patch matching; however, this greedy strategy cannot guarantee the similarity of the patch distributions of the synthesized image and the style image, which limits the expressiveness of the synthesized textures. We solve this problem with an optimal patch matching algorithm based on Optimal Transport (OT) theory, which theoretically guarantees the similarity of the patch distributions and yields a flexible style modeling method. Extensive qualitative and quantitative experiments demonstrate that the proposed method achieves better synthesis results than state-of-the-art style transfer methods, including NST and classic texture-synthesis-based methods.
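
The abstract does not give implementation details, but the contrast it draws between greedy NNS matching and distribution-preserving OT matching can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: patch extraction, patch weighting, and the paper's actual OT solver are omitted, and the function names here are hypothetical. It uses the fact that with uniform weights on two equally sized patch sets, exact OT reduces to the linear assignment problem, which SciPy can solve directly.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def nns_match(synth_patches, style_patches):
    # Greedy nearest-neighbor matching: each synthesized patch independently
    # picks its closest style patch. Popular style patches may be reused many
    # times, so the matched set need not cover the style patch distribution.
    cost = cdist(synth_patches, style_patches, metric="sqeuclidean")
    return cost.argmin(axis=1)

def ot_match(synth_patches, style_patches):
    # OT matching with uniform weights on equally sized patch sets reduces to
    # the linear assignment problem: a one-to-one map minimizing total cost,
    # so every style patch is used exactly once and the matched patches
    # reproduce the empirical style patch distribution.
    cost = cdist(synth_patches, style_patches, metric="sqeuclidean")
    rows, cols = linear_sum_assignment(cost)
    return cols[np.argsort(rows)]  # cols[i] = style patch for synth patch i

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    synth = rng.normal(size=(64, 75))  # 64 flattened 5x5x3 patches (toy data)
    style = rng.normal(size=(64, 75))
    print("NNS uses", len(set(nns_match(synth, style))), "distinct style patches")
    print("OT  uses", len(set(ot_match(synth, style))), "distinct style patches")
```

Running the toy example shows the difference in guarantees: OT matching always uses all 64 style patches exactly once, whereas NNS typically reuses a subset, which is the distribution mismatch the abstract attributes to greedy matching.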
