Abstract
In this paper, we introduce a modified backpropagation algorithm that converges faster than traditional backpropagation for multilayer perceptron neural networks. We also propose a dynamic technique for specifying the number of hidden layers and the number of neurons in each hidden layer, along with an initialization technique for setting the starting values of the synaptic weights. Two novel activation functions are proposed and shown experimentally to accelerate the convergence of neural networks while maintaining the accuracy of the trained network. Experimental results show that the average number of epochs required to reach the target approximation using the exponential hyperbolic tangent is less than half the average number required using the traditional sigmoid function. Finally, a case study is performed on converting bit-mapped pixel images of alphanumeric characters to their equivalent ASCII codes.
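To make the kind of comparison the abstract reports concrete, below is a minimal sketch of measuring epochs-to-convergence for the same multilayer perceptron trained with two different activation functions. The exact formula of the paper's exponential hyperbolic tangent is not given in the abstract, so standard tanh is used here purely as a stand-in alternative to the sigmoid; the network size, learning rate, XOR task, and tolerance are all illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

# XOR: a classic small benchmark for MLP convergence experiments (illustrative
# choice; the paper's case study uses alphanumeric character images instead).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

ACTIVATIONS = {
    # (f, f' expressed in terms of the pre-activation x)
    "sigmoid": (sigmoid, lambda x: sigmoid(x) * (1.0 - sigmoid(x))),
    # tanh is a HYPOTHETICAL stand-in for the paper's "exponential hyperbolic
    # tangent", whose formula is not stated in the abstract.
    "tanh": (np.tanh, lambda x: 1.0 - np.tanh(x) ** 2),
}

def epochs_to_converge(act_name, hidden=4, lr=0.5, tol=0.01, max_epochs=20000):
    f, df = ACTIVATIONS[act_name]
    rng = np.random.default_rng(0)               # same initial weights for both runs
    W1 = rng.uniform(-0.5, 0.5, (2, hidden))     # small random starting weights
    W2 = rng.uniform(-0.5, 0.5, (hidden, 1))
    for epoch in range(1, max_epochs + 1):
        # Forward pass (sigmoid output layer for a 0/1 target).
        z1 = X @ W1
        h = f(z1)
        y = sigmoid(h @ W2)
        err = y - T
        if np.mean(err ** 2) < tol:              # stop once the MSE target is met
            return epoch
        # Backward pass: standard batch gradient-descent backpropagation.
        d2 = err * y * (1.0 - y)
        d1 = (d2 @ W2.T) * df(z1)
        W2 -= lr * h.T @ d2
        W1 -= lr * X.T @ d1
    return max_epochs

for name in ACTIVATIONS:
    print(name, epochs_to_converge(name))
```

Running the script prints the epoch count at which each variant first reaches the error tolerance, which is the metric the abstract uses to compare the proposed activation functions against the traditional sigmoid.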