Abstract

We present a computational model for human texture perception which assigns functional principles to the Gestalt laws of similarity and proximity. Motivated by early vision mechanisms, in the first stage, local texture features are extracted using multi-scale filtering and nonlinear spatial pooling. In the second stage, features are grouped according to the spatial feature binding model of the competitive layer model (CLM; Wersing et al. 2001). The CLM uses cooperative and competitive interactions in a recurrent network, where binding is expressed by the layer-wise coactivation of feature-representing neurons. The Gestalt law of similarity is expressed by a non-Euclidean distance measure in the abstract feature space, with proximity being taken into account by a spatial component. To choose the stimulus dimensions which allow the most salient similarity-based texture segmentation, the feature similarity metric is reduced to the directions of maximum variance. We show that our combined texture feature extraction and binding model performs segmentation in strong conformity with human perception. The examples range from classical microtextures and Brodatz textures to other classical Gestalt stimuli, which offer a new perspective on the role of texture for more abstract similarity grouping.
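As a minimal sketch of two ingredients described above, the following illustrates (a) reducing the feature space to its directions of maximum variance and (b) a pairwise interaction that combines feature similarity with spatial proximity. All function names and parameters (`pca_reduce`, `interaction`, `sigma_f`, `sigma_s`) are hypothetical, and the Gaussian similarity used here is a simplification: the model itself employs a non-Euclidean distance measure and the full recurrent CLM dynamics, neither of which is reproduced here.

```python
import numpy as np

def pca_reduce(features, k):
    """Project feature vectors onto the k directions of maximum
    variance (standard PCA via SVD); a stand-in for the paper's
    reduction of the similarity metric."""
    centered = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:k].T

def interaction(f_i, f_j, p_i, p_j, sigma_f=1.0, sigma_s=2.0):
    """Cooperative interaction between two feature-representing
    units: large when features are similar (Gestalt law of
    similarity) AND positions are close (law of proximity).
    Simplified Gaussian form, not the model's actual measure."""
    sim = np.exp(-np.sum((f_i - f_j) ** 2) / (2 * sigma_f ** 2))
    prox = np.exp(-np.sum((p_i - p_j) ** 2) / (2 * sigma_s ** 2))
    return sim * prox
```

In the CLM, such pairwise interaction strengths would drive the cooperative/competitive recurrent dynamics so that coactive units within one layer come to represent one texture segment.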
