Abstract

Neural network models have been used extensively in the field of pattern recognition. Their ability to learn, their capacity for self-organization, and their efficient hardware implementation make them well suited to many problems in signal processing and pattern recognition. However, current neural network models still have major limitations, one of which is the structure constraint: most artificial neural network models can only change their interconnection weights while their structure remains fixed. T. C. Lee (1991) proposed a theoretical framework for exploring the structure-level adaptability of a neural network. In this paper, a character recognition exercise is presented, using the set of printed alphabetic characters as the test set. A modified structure-level adaptation scheme, with an accelerated back-propagation algorithm used during the training stage, has been implemented with good results. Observations on the error convergence behavior, along with other results, are also presented.
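The abstract does not specify which acceleration scheme is applied to back propagation; a common choice in that era was the momentum variant, in which each weight update accumulates a decaying sum of past gradients. The sketch below illustrates that assumed variant on a tiny fixed-structure network (the XOR task stands in for the character recognition data, which is not available here); the network size, learning rate, and momentum coefficient are all illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hedged sketch: momentum-based back propagation is ASSUMED here as the
# "accelerated" variant; the paper does not state its exact scheme.
rng = np.random.default_rng(0)

# Tiny fixed-structure network: 2 inputs -> 3 hidden units -> 1 output,
# trained on XOR as a stand-in task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)
# Velocity terms: the accumulated past gradients that provide acceleration.
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, mom = 0.5, 0.9  # learning rate and momentum coefficient (assumed values)

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

mse_initial = float(np.mean((forward(X)[1] - y) ** 2))

for _ in range(5000):
    # Forward pass.
    h, out = forward(X)
    # Backward pass for a squared-error loss with sigmoid units.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Momentum update: velocity = mom * velocity - lr * gradient.
    vW2 = mom * vW2 - lr * (h.T @ d_out); W2 += vW2
    vb2 = mom * vb2 - lr * d_out.sum(0);  b2 += vb2
    vW1 = mom * vW1 - lr * (X.T @ d_h);   W1 += vW1
    vb1 = mom * vb1 - lr * d_h.sum(0);    b1 += vb1

mse_final = float(np.mean((forward(X)[1] - y) ** 2))
print(f"MSE before training: {mse_initial:.4f}, after: {mse_final:.4f}")
```

The velocity terms let consistent gradient directions compound across iterations, which typically speeds convergence relative to plain gradient descent; the structure-level adaptation described in the paper (growing or pruning units) would sit on top of such a weight-level training loop.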
