Recent research in arbitrary style transfer has highlighted the difficulty of balancing content structure against style patterns; in addition, applying style patterns improperly onto the content image often yields suboptimal results. In this paper, a novel style transfer network based on multi-feature correlations, called MCNet, is proposed. To better explore the intrinsic relationship between the style and content images and to transfer the most suitable style onto the content image, a novel Global Style-Attentional Transfer Module, named GSATM, is introduced. GSATM comprises two parts: Forward Adaptive Style Transformation (FAST) and Delayed Style Transformation (DST). The former analyzes the relationship between style and content features and fine-tunes the style features, whereas the latter transforms the content features based on the fine-tuned style features. Furthermore, a new encoding and decoding structure is designed to effectively handle the output of GSATM. Extensive quantitative and qualitative experiments demonstrate the superiority of our algorithm. Project page: https://github.com/XiangJinCherry/MCNet.
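As a rough illustration of the two-stage pipeline the abstract describes, the sketch below renders GSATM as cross-attention between content and style features (FAST) followed by a statistics-based feature transformation (DST). All class names, tensor shapes, and operations here are assumptions chosen for illustration: the abstract gives no implementation details, so this is a minimal sketch of the general style-attention idea, not the paper's actual formulation.

```python
import torch
import torch.nn as nn

class FAST(nn.Module):
    """Forward Adaptive Style Transformation (illustrative sketch).

    Assumption: cross-attention from content queries to style keys/values,
    producing style features "fine-tuned" to the content layout.
    """
    def __init__(self, channels):
        super().__init__()
        self.q = nn.Conv2d(channels, channels, 1)  # queries from content
        self.k = nn.Conv2d(channels, channels, 1)  # keys from style
        self.v = nn.Conv2d(channels, channels, 1)  # values from style

    def forward(self, content, style):
        b, c, h, w = content.shape
        q = self.q(content).flatten(2).transpose(1, 2)  # (B, HWc, C)
        k = self.k(style).flatten(2)                    # (B, C, HWs)
        v = self.v(style).flatten(2).transpose(1, 2)    # (B, HWs, C)
        # Correlation between every content and style position
        attn = torch.softmax(q @ k / c ** 0.5, dim=-1)  # (B, HWc, HWs)
        tuned = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return tuned  # style features aligned to the content

class DST(nn.Module):
    """Delayed Style Transformation (illustrative sketch).

    Assumption: an AdaIN-like statistic transfer driven by the
    fine-tuned style features returned by FAST.
    """
    def forward(self, content, tuned_style, eps=1e-5):
        c_mean = content.mean(dim=(2, 3), keepdim=True)
        c_std = content.std(dim=(2, 3), keepdim=True) + eps
        s_mean = tuned_style.mean(dim=(2, 3), keepdim=True)
        s_std = tuned_style.std(dim=(2, 3), keepdim=True) + eps
        # Re-normalize content statistics toward the tuned style statistics
        return s_std * (content - c_mean) / c_std + s_mean

class GSATM(nn.Module):
    """Global Style-Attentional Transfer Module: FAST then DST."""
    def __init__(self, channels):
        super().__init__()
        self.fast = FAST(channels)
        self.dst = DST()

    def forward(self, content, style):
        tuned = self.fast(content, style)  # fine-tune the style features
        return self.dst(content, tuned)    # transfer them onto the content
```

Used on encoder feature maps, e.g. `GSATM(512)(content_feat, style_feat)` for VGG-style `(B, 512, H, W)` features; the real MCNet module may differ in both attention design and transformation.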