Abstract

The objective of this paper is to develop an effective photographic style transfer method that preserves the semantic correspondence between the style and content images. A semantic-correspondence-guided deep photo style transfer algorithm is developed, which ensures that the semantic structure of the content image is not changed while the color of the style image is migrated. The semantic correspondence is constructed at a large scale over regions based on image segmentation, and at a local scale over patches using deep image analogy. Based on the semantic correspondence, a matting optimization is employed to refine the style transfer result, ensuring semantic accuracy and transfer faithfulness. The proposed method is further extended to automatically retrieve style images from a database, making style transfer more user-friendly. Experimental results show that our method successfully conducts style transfer while preserving semantic correspondence across a diversity of scenes. A user study also shows that our method outperforms state-of-the-art photographic style transfer methods.
