Abstract

The authors describe a learning strategy, motivated by computational constraints, that enhances the speed of neural network learning. Decision regions in feature space are of three types: (1) well-separated clusters (Type A), (2) disconnected clusters (Type B), and (3) clusters separated by complex boundaries (Type C). These decision regions have psychological validity, as is evident from E. Rosch's (1976) categorization theory. Rosch suggests that in taxonomies of real objects, there is one level of abstraction at which basic category cuts are made. Basic categories are similar to Type A clusters. Categories one level more abstract than basic categories are superordinate categories, and categories one level less abstract are subordinate categories; these correspond to Type B and Type C clusters, respectively. It is proved that, in a binary-valued feature space, basic categories can be learned by a perceptron. A two-layer network for classifying basic categories in a multi-valued feature space is described. This network is used as a basis to construct the neural network STRUCT for learning superordinate and subordinate categories.
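The following is a minimal sketch, not the authors' STRUCT network, of the perceptron-learnability claim for basic (Type A) categories: a standard perceptron trained on binary-valued feature vectors representing two well-separated clusters. The toy data, learning rate, and epoch limit are illustrative assumptions.

```python
# Sketch only: classic perceptron on binary features, illustrating that
# well-separated (Type A / basic) categories are linearly separable and
# hence perceptron-learnable. Data below are hypothetical.
import numpy as np

def train_perceptron(X, y, epochs=100, lr=1.0):
    """X: (n_samples, n_features) binary features; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:  # misclassified -> update
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:  # converged: training set is linearly separated
            break
    return w, b

# Two well-separated binary "basic categories" (hypothetical example)
X = np.array([[1, 1, 0, 0], [1, 1, 1, 0],   # category +1
              [0, 0, 1, 1], [0, 1, 1, 1]])  # category -1
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # expected: [ 1.  1. -1. -1.]
```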
