Abstract

The extended associative memory (AM) neural network (EAMNN) has the advantage of performing classification in noisy environments. We propose a faster, more robust learning algorithm for the EAMNN, built on a new error cost function that combines a weighted sum of the standard output error and the Hamming distance of the output error with an additional term involving the derivatives of the first hidden layer's activation functions. The fast backpropagation training uses a modified steepest descent method, derived by changing the error function so that weights are updated according to the output error; this significantly accelerates training of the MLP and BAM. The algorithm effectively forces the hidden-layer activations toward saturation, reducing the sensitivity of the output values to the input variables. It improves the robustness of classification performance, increases associative memory capacity, and speeds up EAMNN training. Experiments verify that the proposed network outperforms comparable networks. We then propose a two-level tree-structured modular EAMNN for large-set pattern classification.
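The abstract does not give the exact form of the proposed cost function. A minimal sketch consistent with the description above, assuming a squared output error, a Hamming-distance penalty on the thresholded outputs, and weighting coefficients $\alpha$, $\beta$, $\gamma$ (our notation, not necessarily the authors'), is:

$$E = \alpha \sum_{k} (t_k - o_k)^2 \;+\; \beta \, d_H\!\big(\mathbf{t},\, H(\mathbf{o})\big) \;+\; \gamma \sum_{j} f'(\mathrm{net}_j)$$

Here $\mathbf{t}$ is the target vector, $\mathbf{o}$ the network output, $H(\cdot)$ a hard-threshold operator, $d_H$ the Hamming distance, and $f'(\mathrm{net}_j)$ the derivative of the $j$-th first-hidden-layer activation. Because a sigmoid's derivative is small when the unit is saturated, minimizing the last term drives the first-hidden-layer activations toward saturation, matching the stated goal of reducing output sensitivity to the inputs; the precise combination of terms is defined in the paper's full text.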
