Abstract

The field of Neural Style Transfer (NST) has produced compelling applications that enable us to transform reality as human beings perceive it. In particular, NST for material translation aims to change the material (texture) of an object to a different material taken from a desired style image. To generate more realistic results, in this paper we propose a partial texture style transfer method that combines NST with semantic segmentation. The original NST algorithm changes the style of the entire image, including the background, even though the texture of interest is contained only in object regions. We therefore segment target objects using a weakly supervised segmentation method and transfer the material of the style image only to the material-based segmented areas. As a result, we achieve partial style transfer restricted to specific object regions, which enables us to change the materials of objects in a given image as we like. Furthermore, we analyze the material translation capability of state-of-the-art image-to-image (I2I) translation algorithms, including the conventional NST method of Gatys et al., WCT, StarGAN, MUNIT, and DRIT++. Our experimental results suggest that conventional NST produces more realistic results than the other I2I translation methods, and that certain materials are easier to synthesize than others.
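The core idea of restricting style transfer to segmented object regions can be sketched as a masked composite: run a full-image stylization, then blend it with the original content image using the binary segmentation mask. The sketch below is a minimal, hypothetical illustration with NumPy (the function name and array conventions are assumptions, not the authors' implementation; in the paper the mask comes from a weakly supervised segmentation model and the stylized image from an NST pipeline).

```python
import numpy as np

def partial_style_transfer(content, stylized, mask):
    """Composite the stylized image into the content image only where
    the segmentation mask is 1 (illustrative sketch, not the paper's code).

    content, stylized: float arrays of shape (H, W, 3)
    mask: binary array of shape (H, W), 1 inside the target object region
    """
    # Add a channel axis so the mask broadcasts over RGB channels.
    m = mask[..., None].astype(content.dtype)
    # Inside the mask take the stylized pixels; outside keep the original.
    return m * stylized + (1.0 - m) * content
```

A smoother boundary can be obtained by feathering the mask (e.g. a Gaussian blur) before compositing, so the transferred material blends into the untouched background.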
