Abstract
A general adaptive model unifying existing models for pattern learning is proposed. The model, in addition to preserving the merits of geometric and syntactic approaches to pattern recognition, has decisive advantages over them. It can be viewed as a far-reaching generalization of the perceptron, or neural net, models, in which the vector representation and the associated vector operations are replaced by a more general structural representation and the corresponding structural operations. The basis of the model is the concept of a transformation system, which is a generalization of Thue (Post production) systems. Parametric distance functions in transformation systems are introduced. These are generalizations of weighted Levenshtein (edit) distances to more general structured objects. A learning model for transformation systems, unifying many existing models (including that of neural nets), is proposed. The model also suggests how various propositional object (class) descriptions might be generated based on the outputs of the learning processes: these descriptions represent a “translation” of some information encoded in the nonpropositional “language” of the corresponding transformation system, representing the environment, into the chosen logical (propositional) language, whose semantics is now defined by the “translation”. In light of the metric model, intelligence emerges as based on simple arithmetic processes: first, those related to the optimal distance computation, and, second, “counting” and comparing the results of counting for various “important” features detected at the learning stage (arithmetic predicates).
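The parametric distance functions mentioned above generalize the weighted Levenshtein (edit) distance from strings to richer structured objects. As a point of reference for that special case, the sketch below shows the classical weighted edit distance computed by dynamic programming; the cost parameters and function name are illustrative assumptions, not taken from the paper.

```python
def weighted_edit_distance(a, b, ins_cost=1.0, del_cost=1.0, sub_cost=1.0):
    """Minimum total cost of insertions, deletions, and substitutions
    transforming string a into string b (classical weighted Levenshtein)."""
    m, n = len(a), len(b)
    # dp[i][j] = cheapest transformation of a[:i] into b[:j]
    dp = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i * del_cost
    for j in range(1, n + 1):
        dp[0][j] = j * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0.0 if a[i - 1] == b[j - 1] else sub_cost
            dp[i][j] = min(
                dp[i - 1][j] + del_cost,   # delete a[i-1]
                dp[i][j - 1] + ins_cost,   # insert b[j-1]
                dp[i - 1][j - 1] + sub,    # substitute or match
            )
    return dp[m][n]


if __name__ == "__main__":
    # Two substitutions with unit costs, so this prints 2.0
    print(weighted_edit_distance("pattern", "matters"))
```

In the transformation-system setting, the insertion, deletion, and substitution operations are replaced by more general structural transformations, and the scalar costs become the learnable parameters of the distance.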