Abstract

Image style transfer changes the style of an input image without modifying its original content. Our empirical studies show that, when existing style transfer techniques are applied to headshot portraits, noticeable distortions can occur because the relative locations of facial components, such as the lips, nose, and eyes, are sensitive to spatial differences. Building on the DPST (deep photo style transfer) method, this paper proposes a spatially robust image style transfer algorithm for headshot portraits. The proposed algorithm addresses the problem that the transferred content can exhibit ghost shadows when corresponding segmented regions of the input image and the style image differ significantly in position. We propose to apply affine transformations to the style image before it is used for style transfer, choosing the transformations that maximize the normalized cross-correlation between the channels of the feature maps of corresponding segmented regions of the input image and the style image in a pre-trained convolutional neural network. Each segmented region then takes its spatially transformed style image as the style reference, so the transfer adapts to the spatial difference between the input and style images and the resulting ghost shadows are minimized or eliminated. Experimental results show that the proposed algorithm obtains good results even when the semantic segmentations of the two images have large spatial differences, and that it is more robust than the DPST algorithm and other benchmarks.
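The core alignment step described above can be sketched as follows. This is a simplified illustration, not the paper's implementation: it computes the per-channel normalized cross-correlation between two feature maps and searches over integer translations (a stand-in for the full affine search, which would also cover rotation and scale) to find the spatial transform of the style features that best matches the content features. The function names and the use of raw NumPy arrays in place of CNN feature maps are assumptions for the sketch.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    # a, b: feature maps of shape (C, H, W); returns the mean per-channel NCC
    a = a.reshape(a.shape[0], -1).astype(float)
    b = b.reshape(b.shape[0], -1).astype(float)
    a -= a.mean(axis=1, keepdims=True)
    b -= b.mean(axis=1, keepdims=True)
    num = (a * b).sum(axis=1)
    den = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-8
    return float((num / den).mean())

def best_translation(content_feat, style_feat, max_shift=3):
    # Exhaustively search integer translations of the style feature map and
    # keep the shift that maximizes NCC with the content feature map.
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(style_feat, (dy, dx), axis=(1, 2))
            score = normalized_cross_correlation(content_feat, shifted)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift, best_score
```

In the full algorithm this search would run per segmented region on feature maps extracted from a pre-trained CNN, and the recovered transform would be applied to the style image itself before transfer.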
