Abstract

The Back Propagation Network (BPN) is one of the most widely used neural networks. It has a distinct training phase, after which it is put to use. It is observed that some training sets can be trained in a BPN while others cannot; this is an inherent property of the training set and the BPN architecture. This paper proposes a new training index that can be evaluated after a certain number of training epochs and indicates the ability of the BPN to train for the given topology and training set. The index is evaluated for several logic gates, and the results are presented.
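The abstract does not define the index itself, so the following is only a minimal sketch of the general idea: train a small backpropagation network on a logic-gate training set (XOR here) and, after a fixed number of probe epochs, compute a hypothetical trainability indicator. The 2-2-1 topology, the learning rate, the probe-epoch count, and the index definition (relative drop in mean squared error) are all assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch (not the paper's actual index): a small backpropagation
# network trained on the XOR gate, with a hypothetical "training index"
# computed after a fixed number of epochs as the relative drop in mean
# squared error. A low value suggests the topology/training-set pair is
# unlikely to train; the paper's real index may be defined differently.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed 2-2-1 topology with small random initial weights
W1 = rng.normal(scale=0.5, size=(2, 2))
b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1))
b2 = np.zeros(1)

def mse():
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    return float(np.mean((T - y) ** 2))

lr = 0.5                 # assumed learning rate
probe_epochs = 500       # evaluate the index after this many epochs
initial_error = mse()

for epoch in range(probe_epochs):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass: gradient of MSE through the sigmoid units
    delta_out = (y - T) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid
    b1 -= lr * delta_hid.sum(axis=0)

# Hypothetical training index: fraction of the initial error removed so far.
# Values near 1 indicate the network is training; values near 0 suggest it
# is stuck for this topology and training set.
training_index = 1.0 - mse() / initial_error
print(f"training index after {probe_epochs} epochs: {training_index:.3f}")
```

In this sketch the index is a simple error-reduction ratio; whatever its exact form, the point of such an index is to flag non-converging topology/training-set combinations early, before a full training run is wasted.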
