Abstract

Textures carry a wealth of image information and are widely used in fields such as computer graphics and computer vision. With the development of machine learning, texture synthesis and generation have improved greatly. Wallpapers, a common element of everyday life, contain rich texture information that is difficult to annotate with a single label. Moreover, wallpaper designers spend significant time creating wallpapers in different styles. For this purpose, this paper proposes describing wallpaper texture images with multi-label semantics. Based on these labels and generative adversarial networks, we present a framework for perception-driven wallpaper texture generation and style transfer. In this framework, a perceptual model is trained to recognize whether the wallpapers produced by the generator network are sufficiently realistic and possess the attributes designated by a given perceptual description; these multi-label semantic attributes are treated as condition variables for generating wallpaper images. The generated wallpaper images can then be converted to the styles of well-known artists using CycleGAN. Finally, the generated wallpaper images are quantitatively assessed with an aesthetic evaluation method. The experimental results demonstrate that the proposed method can generate wallpaper textures that conform to human aesthetics and exhibit artistic characteristics.

Highlights

  • Textures express the features and structural levels of an object’s surface, carry rich image feature information, and are widely used in many image processing tasks

  • Wallpaper texture images contain a wealth of texture information that is difficult to annotate with a single label; we therefore propose describing wallpaper textures with multi-label semantics

  • To address these problems, we propose a wallpaper texture image generation model based on multi-label semantics, augment the wallpaper dataset with multiple styles and semantic description labels, and use a perception-driven texture generation technique to generate wallpaper images that meet user requirements according to the user’s perceptual descriptions


Summary

INTRODUCTION

Textures express the features and structural levels of an object’s surface, carry rich image feature information, and are widely used in many image processing tasks. To address the problems outlined above, in this paper we propose a wallpaper texture image generation model based on multi-label semantics. We augment the wallpaper dataset with multiple styles and semantic description labels, and use a perception-driven texture generation technique to generate wallpaper images that meet user requirements according to the user’s perceptual descriptions. The model also uses a style transfer method to convert the style of wallpaper images and assist artistic creation from an artist’s perspective.
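The conditioning mechanism at the heart of this framework, using multi-label semantic attributes as condition variables for the generator, can be sketched as follows. This is a minimal, hypothetical illustration rather than the paper's actual network: the dimensions, the single linear layer, and the attribute names are assumptions, and the only point shown is how a label vector is concatenated with the noise input so that different perceptual descriptions yield different textures.

```python
import numpy as np

rng = np.random.default_rng(0)

NOISE_DIM = 64        # latent noise dimension (hypothetical)
NUM_LABELS = 5        # e.g. semantic attributes such as "floral", "striped" (hypothetical)
IMG_SIDE = 32         # flattened grayscale output size (hypothetical)

# Stand-in "generator": one random linear layer followed by tanh,
# in place of a trained deep generator network.
W = rng.standard_normal((NOISE_DIM + NUM_LABELS, IMG_SIDE * IMG_SIDE)) * 0.01

def generate(labels: np.ndarray) -> np.ndarray:
    """Produce one fake wallpaper image conditioned on a multi-label vector."""
    z = rng.standard_normal(NOISE_DIM)
    # Conditioning: the multi-label vector is concatenated with the noise,
    # so the same network maps different attribute combinations to
    # different textures.
    x = np.concatenate([z, labels])
    return np.tanh(x @ W).reshape(IMG_SIDE, IMG_SIDE)

# Request a texture with the first and third attributes active.
img = generate(np.array([1.0, 0.0, 1.0, 0.0, 0.0]))
print(img.shape)  # (32, 32)
```

In a full conditional GAN, the perceptual model described in the abstract would score each generated image against the requested attributes and feed that signal back into training; here the untrained layer merely demonstrates the data flow.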

Semantic descriptions
Texture generation
Style transfer
WALLPAPER TEXTURE DATASET
Wallpaper labels prediction
Perception driven wallpaper texture generation
Wallpaper texture style transfer
EXPERIMENTS
Wallpaper texture generation
Style transfer and aesthetic evaluation
ABLATION EXPERIMENTS
CONCLUSION