Abstract

In this paper we propose a new approach to finding the optimum learning rate, one that increases the recognition rate and reduces the training time of both a back-propagation neural network and a single-layer feed-forward neural network. We present a comparative analysis of the performance of these two networks. Our approach uses a variable learning rate, and we demonstrate its superiority over a constant learning rate. We assign different numbers of inner epochs to different input patterns according to how difficult they are to recognize, and we show the effect of the optimum number of inner epochs, the best variable learning rate, and the number of hidden neurons on training time and recognition accuracy. We apply our algorithm to a face recognition application using Principal Component Analysis and a neural network, and demonstrate the effect of the number of hidden neurons and the size of the feature vector on training time and recognition accuracy for a given number of input patterns. All experiments use the ORL face database.
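The two ideas named above, a variable learning rate and per-pattern inner epochs, can be illustrated with a minimal sketch. The decay schedule, the error-based inner-epoch rule, and all numeric constants below are illustrative assumptions; the abstract does not specify the paper's actual schedule or parameters.

```python
import numpy as np

# Sketch only: a toy single-layer feed-forward net trained with the
# delta rule. The learning-rate decay and the "harder patterns get
# more inner epochs" heuristic are assumed, not taken from the paper.

rng = np.random.default_rng(0)

W = rng.normal(scale=0.1, size=(2, 4))              # 4 inputs -> 2 outputs
X = rng.normal(size=(8, 4))                         # 8 input patterns
T = rng.integers(0, 2, size=(8, 2)).astype(float)   # toy binary targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta0 = 0.5
for epoch in range(50):
    # Variable learning rate: assumed inverse decay over outer epochs.
    eta = eta0 / (1.0 + 0.05 * epoch)
    for x, t in zip(X, T):
        # Patterns with larger current error get more inner epochs,
        # i.e. more consecutive weight updates on the same pattern.
        err = np.abs(t - sigmoid(W @ x)).sum()
        inner_epochs = 1 + min(4, int(err * 4))
        for _ in range(inner_epochs):
            y = sigmoid(W @ x)
            delta = (t - y) * y * (1.0 - y)         # sigmoid delta rule
            W += eta * np.outer(delta, x)
```

With a constant learning rate, the same loop simply fixes `eta = eta0`; the paper's comparison amounts to measuring training time and recognition rate under the two settings.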
