Abstract

This paper compares the performance of two variants of the backpropagation algorithm, gradient descent with momentum and adaptive learning rate backpropagation (TRAINGDX) and BFGS quasi-Newton backpropagation (TRAINBFG), in a multilayer feed-forward neural network for recognizing handwritten English vowel characters. The analysis uses five samples of handwritten English vowels collected from five different people and stored as images. Each scanned image is partitioned into four portions, and the density of each portion is computed with a MATLAB function. The four densities of each character form the input pattern for the two neural network architectures. In the proposed work, the multilayer feed-forward neural networks are trained with these two backpropagation variants, the quasi-Newton backpropagation learning algorithm and the gradient descent with momentum and adaptive backpropagation learning algorithm, on the training set of handwritten English vowels. The performance of both network architectures is analyzed for convergence and non-convergence, and different error trends are observed in the non-convergence cases. The results show that, on this training set, the gradient descent learning algorithm has a limitation in convergence due to the local minima problem, which is inherent in backpropagation learning.
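The paper computes the four quadrant densities with a MATLAB function that is not reproduced here. As a minimal sketch of the same feature extraction, assuming the image is binarized (nonzero pixels are ink) and that density means the fraction of ink pixels in each quadrant, a Python version could look like this (function name and quadrant ordering are illustrative assumptions):

```python
import numpy as np

def quadrant_densities(image):
    """Split a binary character image into 4 quadrants and return the
    ink density (fraction of nonzero pixels) of each quadrant.

    image: 2D array, nonzero = ink. Returns a length-4 vector ordered
    top-left, top-right, bottom-left, bottom-right.
    """
    h, w = image.shape
    hh, hw = h // 2, w // 2
    quadrants = [
        image[:hh, :hw],   # top-left
        image[:hh, hw:],   # top-right
        image[hh:, :hw],   # bottom-left
        image[hh:, hw:],   # bottom-right
    ]
    return np.array([np.count_nonzero(q) / q.size for q in quadrants])

# Example: a 4x4 image with ink only in the top-left quadrant
img = np.zeros((4, 4), dtype=int)
img[:2, :2] = 1
print(quadrant_densities(img))  # [1. 0. 0. 0.]
```

Each character then contributes one such 4-element vector as the input pattern for both network architectures.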
