Abstract

A new artificial neural model for unsupervised learning is proposed. Consider first a two-class pattern recognition problem. We use one neuron (possibly higher order) with a sigmoid output in the range from −1 to 1. A positive output means class 1 and a negative output means class 2. The main idea of the method is to iterate the weights so as to move the decision boundary to a region of low pattern density. With the length of the weight vector constrained, outputs that are mostly near 1 or −1 mean that the patterns lie mostly far from the decision boundary, so we probably have a good classifier. We define a function which measures how close the output is to 1 or −1. Training is performed by a steepest-ascent algorithm on the weights. The method is extended to the multiclass case by applying the previous procedure in a hierarchical manner (i.e., by partitioning the patterns into two groups, then considering each group separately and partitioning it further, and so on, until we end up with the final classifier).
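The binary procedure described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the choice of tanh as the sigmoid, the criterion J(w) = mean(y²) as the "closeness to ±1" measure, the renormalization step as the way of constraining the weight-vector length, and the function name `train_boundary_neuron` are all assumptions for the sake of the example.

```python
import numpy as np

def train_boundary_neuron(X, lr=0.1, epochs=200, seed=0):
    """Unsupervised single-neuron training (illustrative sketch).

    Performs steepest ascent on J(w) = mean(tanh(w.x)^2), one plausible
    measure of how close the outputs are to +/-1, renormalizing w to
    unit length after each step to constrain the weight-vector length.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])  # append a bias input (assumption)
    w = rng.standard_normal(d + 1)
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        y = np.tanh(Xb @ w)                  # sigmoid outputs in (-1, 1)
        # dJ/dw = mean over patterns of 2 * y * (1 - y^2) * x
        grad = (2 * y * (1 - y**2)) @ Xb / n
        w += lr * grad                       # steepest-ascent step
        w /= np.linalg.norm(w)               # length constraint
    return w

# Usage: two well-separated clusters; the boundary should settle between them.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3.0, 0.5, (50, 2)),
               rng.normal(3.0, 0.5, (50, 2))])
w = train_boundary_neuron(X)
y = np.tanh(np.hstack([X, np.ones((100, 1))]) @ w)
labels = np.sign(y)  # positive -> class 1, negative -> class 2
```

Note that maximizing such a criterion can also be satisfied by a degenerate boundary lying outside all the data (every output saturates at the same sign), so in practice the initialization or the criterion would need to guard against that case; the abstract does not specify how this is handled.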
