Abstract

In this paper we propose a new approach to finding the optimum learning rate, one that increases the recognition rate and reduces the training time of both a back-propagation neural network and a single-layer feed-forward neural network. We present a comparative performance analysis of the two networks. Our approach uses a variable learning rate, and we demonstrate its superiority over a constant learning rate. We assign different numbers of inner epochs to different input patterns according to their difficulty of recognition, and we show the effect of the optimum number of inner epochs, the best variable learning rate, and the number of hidden neurons on training time and recognition accuracy. We apply our algorithm to a face recognition application using Principal Component Analysis and a neural network, and demonstrate the effect of the number of hidden neurons and the size of the feature vector on training time and recognition accuracy for a given number of input patterns. All experiments use the ORL face database.
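The constant-versus-variable learning-rate comparison central to the abstract can be illustrated with a minimal gradient-descent sketch on a single linear neuron. This is an illustrative assumption, not the paper's actual training algorithm: the decay schedule, toy data, and function names below are invented for demonstration only.

```python
def train_step(w, x, y, lr):
    """One gradient-descent step for a single linear neuron with squared-error loss."""
    grad = 2.0 * (w * x - y) * x  # d/dw of (w*x - y)^2
    return w - lr * grad

def train(w0, data, lr_schedule, epochs=50):
    """Train the neuron, taking the learning rate for each epoch from lr_schedule(epoch)."""
    w = w0
    for epoch in range(epochs):
        for x, y in data:
            w = train_step(w, x, y, lr_schedule(epoch))
    return w

# Toy data generated by the target weight w = 2.0.
data = [(1.0, 2.0), (2.0, 4.0), (0.5, 1.0)]

# Constant learning rate throughout training.
w_const = train(0.0, data, lambda e: 0.05)

# Variable learning rate: larger steps early, decaying as training proceeds.
w_var = train(0.0, data, lambda e: 0.2 / (1.0 + 0.1 * e))

print(abs(w_const - 2.0), abs(w_var - 2.0))
```

Both schedules converge on this toy problem; the intuition behind a variable rate is that large early steps speed up the initial descent while the later decay avoids overshooting near the optimum.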
