Effective texture classification requires image descriptors capable of efficiently detecting, extracting, and describing the most relevant information in the images, so that, for instance, different texture classes can be distinguished despite image distortions such as changes in illumination, viewpoint, scale, and rotation. Designing such an image descriptor is a challenging task that typically requires the intervention of human experts. In this paper, a general method to automatically design effective image descriptors is proposed. Our method is based on grammatical evolution: taking a set of example images from a texture classification problem and a classification algorithm as inputs, it generates problem-adapted image descriptors that achieve highly competitive classification results. Our method is tested on five well-known texture data sets with different numbers of classes and image distortions to demonstrate its effectiveness and robustness. Our classification results are statistically compared against those obtained with six popular state-of-the-art hand-crafted texture descriptors. This statistical analysis shows that our evolutionarily designed descriptors outperform most of those designed by human experts.
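For illustration only, the following is a minimal, hypothetical Python sketch of how a grammatical-evolution loop of this kind might evaluate candidate descriptors: integer genotypes are mapped through a grammar to a descriptor pipeline, and fitness is the classification accuracy that descriptor yields on the example images. The toy grammar, the descriptor primitives, and the 1-NN classifier used here are assumptions for demonstration, not the grammar or descriptors proposed in the paper.

```python
# Hypothetical sketch of grammatical evolution for descriptor design.
# Grammar, primitives, and classifier are illustrative assumptions only.
import random
import numpy as np

# Toy grammar: a descriptor is a short pipeline of per-image statistics.
GRAMMAR = {
    "<descriptor>": [["<stat>"], ["<stat>", "<stat>"]],
    "<stat>": [["mean"], ["std"], ["grad_mean"], ["hist4"]],
}

PRIMITIVES = {
    "mean": lambda img: [img.mean()],
    "std": lambda img: [img.std()],
    "grad_mean": lambda img: [np.abs(np.diff(img, axis=1)).mean()],
    "hist4": lambda img: list(np.histogram(img, bins=4, range=(0, 1))[0] / img.size),
}

def map_genotype(codons, symbol="<descriptor>"):
    """GE mapping: integer codons select grammar rules, yielding primitive names."""
    out, stack, i = [], [symbol], 0
    while stack:
        sym = stack.pop(0)
        if sym in GRAMMAR:
            rules = GRAMMAR[sym]
            choice = rules[codons[i % len(codons)] % len(rules)]
            i += 1
            stack = list(choice) + stack
        else:
            out.append(sym)
    return out

def describe(img, primitives_used):
    """Apply the evolved descriptor (a list of primitives) to one image."""
    feats = []
    for name in primitives_used:
        feats.extend(PRIMITIVES[name](img))
    return np.array(feats)

def fitness(codons, images, labels):
    """Fitness = leave-one-out 1-NN accuracy of the descriptor on the example set."""
    prims = map_genotype(codons)
    X = np.array([describe(img, prims) for img in images])
    correct = 0
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf
        correct += labels[int(d.argmin())] == labels[i]
    return correct / len(X)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "texture" images: two classes differing in intensity spread.
    images = [rng.random((16, 16)) * (0.5 if k % 2 else 1.0) for k in range(40)]
    labels = [k % 2 for k in range(40)]
    population = [[random.randrange(256) for _ in range(8)] for _ in range(20)]
    for gen in range(10):  # simple elitist loop with codon mutation only
        scored = sorted(population, key=lambda c: -fitness(c, images, labels))
        best = scored[0]
        population = [best] + [
            [c if random.random() > 0.2 else random.randrange(256) for c in best]
            for _ in range(19)
        ]
    print("best descriptor:", map_genotype(best),
          "accuracy:", fitness(best, images, labels))
```

In this sketch, the search is driven purely by classification accuracy on the example images, which mirrors the idea of generating problem-adapted descriptors; any real implementation would use a richer grammar over image-processing operations and a full evolutionary algorithm with crossover and selection.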