Abstract
This paper presents a method for applying inductive learning techniques to texture description and recognition. Local texture features are computed by two well-known methods, Laws' masks and co-occurrence matrices. A three-level generalization of the local features is then applied to create texture description rules. The first generalization level, the scaling interface, transforms the numeric values of the local texture features into a higher symbolic representation as numerical ranges; it also tests data consistency. The second generalization step is the creation of description rules using an inductive incremental learning algorithm. The SG-TRUNC method of rule reduction is applied as the final hierarchical generalization level. This machine learning approach to texture description and recognition is compared with classic pattern recognition methodology. Recognition results are presented for six classes of textures characterized by smoothly changing illumination and/or texture resolution. The inductive learning approach achieved an average recognition rate of 91% and recognized all classes of textures. In comparison, the traditional k-NN pattern recognition method failed to recognize one class of texture and achieved an average recognition rate of 83%. The proposed methodology smooths the recognition rates across the hierarchy of generalization levels, i.e., each successive generalization step increases the rates for classes that were less easily recognized and decreases the rates for classes that were more easily recognized.
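To make the feature-extraction and scaling steps named above concrete, the following is a minimal sketch, not the authors' implementation: it computes Laws' texture energy maps and a few co-occurrence (Haralick-style) statistics, then maps numeric feature values to symbolic range indices as a stand-in for the scaling interface. The mask names (L5, E5, S5, R5) follow Laws' standard notation; the window size, number of grey levels, offset, and number of ranges are illustrative assumptions.

```python
# Sketch of local texture features (Laws' masks, co-occurrence matrix) and a
# simple discretization step standing in for the paper's "scaling interface".
import numpy as np
from scipy.ndimage import convolve, uniform_filter

# Laws' 1D kernels; the 2D masks are their outer products.
L5 = np.array([ 1.,  4., 6.,  4.,  1.])   # level
E5 = np.array([-1., -2., 0.,  2.,  1.])   # edge
S5 = np.array([-1.,  0., 2.,  0., -1.])   # spot
R5 = np.array([ 1., -4., 6., -4.,  1.])   # ripple

def laws_energy(image, window=15):
    """Texture energy maps: convolve with each 2D Laws mask, average |response| locally."""
    feats = {}
    kernels = [("L5", L5), ("E5", E5), ("S5", S5), ("R5", R5)]
    for a_name, a in kernels:
        for b_name, b in kernels:
            mask = np.outer(a, b)
            response = convolve(image.astype(float), mask, mode="reflect")
            feats[a_name + b_name] = uniform_filter(np.abs(response), size=window)
    return feats

def glcm_features(patch, levels=16, offset=(0, 1)):
    """Co-occurrence matrix for one offset plus a few classic statistics."""
    q = (patch.astype(float) / max(patch.max(), 1) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    dy, dx = offset
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return {
        "contrast":    float(np.sum(glcm * (i - j) ** 2)),
        "energy":      float(np.sum(glcm ** 2)),
        "homogeneity": float(np.sum(glcm / (1.0 + np.abs(i - j)))),
    }

def to_ranges(values, num_ranges=5):
    """Illustrative scaling step: map numeric feature values to symbolic range indices."""
    values = np.asarray(values, dtype=float)
    edges = np.linspace(values.min(), values.max(), num_ranges + 1)[1:-1]
    return np.digitize(values, edges)          # indices 0 .. num_ranges-1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    texture = rng.integers(0, 256, size=(64, 64))      # stand-in texture patch
    energies = laws_energy(texture)
    stats = glcm_features(texture)
    print({k: round(v, 3) for k, v in stats.items()})
    # Discretize one energy feature into symbolic ranges (per-feature in practice).
    print(to_ranges(energies["L5E5"].ravel()[:10], num_ranges=3))
```

In the paper's pipeline the symbolic range values produced by such a scaling step would then feed the inductive incremental rule learner and the SG-TRUNC rule-reduction stage; those steps are not sketched here.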