Abstract

Texture exemplars have been widely used to synthesize 3D movie scenes and the appearance of virtual objects. Unfortunately, conventional texture synthesis methods usually emphasize generating optimal target textures of arbitrary sizes or with diverse effects, and pay little attention to automatic texture exemplar extraction. Obtaining texture exemplars remains a labor-intensive task that usually requires careful cropping and post-processing. In this paper, we present an automatic texture exemplar extraction method based on a Trimmed Texture Convolutional Neural Network (Trimmed T-CNN). Specifically, our Trimmed T-CNN learns filter banks for texture exemplar classification and recognition. It is trained on a standard dataset containing thousands of ideal texture exemplars, which were collected and cropped by invited artists. To efficiently identify exemplar candidates in an input image, we employ a selective search algorithm to extract potential texture exemplar patches. All candidates are then fed into the Trimmed T-CNN, which evaluates them against the learned filter banks, and optimal texture exemplars are identified with a scoring and ranking scheme. Our method is evaluated on various kinds of textures and through user studies. Comparisons with different feature-based methods and different deep CNN architectures (AlexNet, VGG-M, Deep-TEN and FV-CNN) are also conducted to demonstrate its effectiveness.
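
The abstract describes a three-stage pipeline: region proposals via selective search, per-patch scoring with the Trimmed T-CNN, and ranking to pick the best exemplars. The following is a minimal Python sketch of that pipeline, assuming OpenCV's selective search as the proposal step and a small placeholder CNN standing in for the Trimmed T-CNN, whose actual architecture and training procedure are not given in the abstract; the function names (propose_patches, rank_exemplars) and the 224x224 input size are illustrative assumptions, not the authors' implementation.

```python
import cv2                      # requires opencv-contrib-python for ximgproc
import numpy as np
import torch
import torch.nn as nn

def propose_patches(image, max_candidates=200):
    """Generate candidate exemplar patches with OpenCV's selective search
    (a stand-in for the proposal step described in the abstract)."""
    ss = cv2.ximgproc.segmentation.createSelectiveSearchSegmentation()
    ss.setBaseImage(image)
    ss.switchToSelectiveSearchFast()
    rects = ss.process()[:max_candidates]          # (x, y, w, h) proposals
    patches = []
    for (x, y, w, h) in rects:
        patch = image[y:y + h, x:x + w]
        patch = cv2.resize(patch, (224, 224))      # fixed input size (assumed)
        patches.append(patch)
    return rects, np.stack(patches)

class ExemplarScorer(nn.Module):
    """Placeholder CNN that outputs an 'ideal exemplar' score per patch.
    The real Trimmed T-CNN architecture is not specified in the abstract."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1)).squeeze(1)

def rank_exemplars(image, model, top_k=5):
    """Score every candidate patch and return the top-k regions with scores."""
    rects, patches = propose_patches(image)
    x = torch.from_numpy(patches).permute(0, 3, 1, 2).float() / 255.0
    with torch.no_grad():
        scores = model(x)
    order = scores.argsort(descending=True)[:top_k].tolist()
    return [(tuple(rects[i]), scores[i].item()) for i in order]

# Usage (hypothetical image path; the scorer would need to be trained on an
# exemplar dataset before its scores are meaningful):
# img = cv2.imread("wall_texture.jpg")
# print(rank_exemplars(img, ExemplarScorer().eval()))
```

In the paper's setting, the scoring network would be trained on the curated exemplar dataset so that high scores correspond to patches artists would consider ideal exemplars; the ranking step then simply sorts candidates by that score.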
