Abstract

Procedural textures are widely used because they can be generated easily from mathematical models. However, the model parameters are neither perceptually meaningful nor uniform for non-expert users. In this paper, we propose a system that generates procedural textures interactively along chosen perceptual dimensions. We built a procedural texture dataset and measured twelve perceptual properties of a small subset through psychophysical experiments. The perceived magnitudes of the remaining textures were estimated by Support Vector Machines using computational features from a cascaded PCA network. For a given texture displayed on a touch screen, the user makes finger gestures, which are translated into magnitude changes in perceptual space. The texture in the database that matches the new perceptual scale and is nearest in computational feature space is then chosen and displayed. We report experimental results for two perceptual properties, surface roughness and directionality; other properties can be manipulated similarly.
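The retrieval step described above — move a fixed amount along one perceptual dimension, then pick the database texture that best matches the new perceptual scale, breaking ties by distance in computational feature space — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the tolerance band `tol`, and the fallback rule are assumptions, and the perceptual scores are taken as already predicted (e.g. by the SVM mentioned above).

```python
import numpy as np

def retrieve_texture(features, scores, current_idx, delta, tol=0.3):
    """Return the index of the texture whose (precomputed) perceptual
    score is closest to scores[current_idx] + delta, preferring the
    candidate nearest to the current texture in feature space.
    All names and the tolerance rule are illustrative assumptions."""
    target = scores[current_idx] + delta
    # candidates whose score falls within a tolerance band around the target
    candidates = np.where(np.abs(scores - target) <= tol)[0]
    candidates = candidates[candidates != current_idx]
    if candidates.size == 0:
        # fallback: take the single texture with the closest score overall
        others = np.array([i for i in range(len(scores)) if i != current_idx])
        candidates = others[[np.argmin(np.abs(scores[others] - target))]]
    # among candidates, choose the one nearest in computational feature space
    dists = np.linalg.norm(features[candidates] - features[current_idx], axis=1)
    return int(candidates[np.argmin(dists)])
```

A finger gesture on the touch screen would supply `delta` (e.g. a swipe length mapped to a change in perceived roughness), and the returned texture is displayed in place of the current one.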

