Abstract

A novel algorithm is proposed in this paper, which builds and then shrinks a three-layer feed-forward neural network to achieve arbitrary classification in the n-dimensional Euclidean space. The algorithm offers guaranteed convergence and a 100% correct classification rate on training patterns, as well as an explicit generalisation rule for predicting how a trained network generalises to patterns that did not appear in training. Moreover, this generalisation rule is continuously adjustable from an equal-angle measure to an equal-distance measure via a single reference number, allowing performance to be adapted to different requirements.

1 Introduction

Neural networks used as classifiers have recently been studied intensively. Trained networks are able to classify patterns that appeared at the training stage (training patterns) as well as to generalise what they have learned, namely, to classify patterns that did not appear in training (these patterns shall be referred to as new patterns). The well-known back-propagation algorithm [1] can be used to train a multi-layer feed-forward network to classify Boolean patterns (vectors of Boolean components) or real patterns (vectors in the n-dimensional Euclidean space), but it is not guaranteed to converge to the global optimum solution. Also, there is no explicit generalisation rule to predict how a network trained by back-propagation generalises to new patterns. One often has to rely on experience to choose parameters such as the network size to achieve desirable generalisation.

Recently, Mezard et al. [2] proposed a tiling algorithm for building a multi-layer feed-forward network to classify Boolean patterns with guaranteed convergence. Zollner et al. [3] offered another algorithm for building a three-layer feed-forward network to classify Boolean patterns with guaranteed convergence. Both studies conclude with the possibility of extending their methods to real vectors, though no further details are presented. In their studies, generalisation is emphasised and demonstrated by simulations, as with back-propagation, though no explicit generalisation rule is given.

In this paper, we propose a novel generating-shrinking algorithm which builds and then shrinks a three-layer feed-forward neural network to classify arbitrary patterns in the n-dimensional Euclidean space.
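The paper's exact generalisation rule is not reproduced in this excerpt. As a purely hypothetical illustration of the underlying idea, the sketch below shows how a single reference number could interpolate between an equal-angle measure (cosine similarity to a stored weight vector) and an equal-distance measure (negative Euclidean distance to it). The function names, the linear blending formula, and the prototype-based classifier are all assumptions for illustration, not the algorithm from the paper.

```python
import numpy as np

def similarity(x, w, alpha):
    """Blend an angle-based score with a distance-based score.

    alpha = 0.0 -> purely equal-angle (cosine similarity)
    alpha = 1.0 -> purely equal-distance (negative Euclidean distance)
    Hypothetical illustration only; not the rule from the paper.
    """
    x, w = np.asarray(x, dtype=float), np.asarray(w, dtype=float)
    angle_score = x @ w / (np.linalg.norm(x) * np.linalg.norm(w))
    dist_score = -np.linalg.norm(x - w)
    return (1.0 - alpha) * angle_score + alpha * dist_score

def classify(x, prototypes, alpha):
    """Assign x to the class of the highest-scoring prototype vector."""
    scores = [similarity(x, w, alpha) for w in prototypes]
    return int(np.argmax(scores))

# Example: two class prototypes along the coordinate axes.
protos = [[1.0, 0.0], [0.0, 1.0]]
print(classify([0.9, 0.1], protos, alpha=0.0))  # angle-based decision
print(classify([0.1, 0.9], protos, alpha=1.0))  # distance-based decision
```

A single scalar such as `alpha` gives a continuum of decision behaviours, which is the kind of tunability the abstract attributes to its reference number.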
