Abstract

Neural networks (NNs) trained with backpropagation (BP) are widely used for recognition and learning. The basic network has three layers: an input layer, one hidden layer, and an output layer; the scale of a 3-layered NN therefore depends on the number of hidden-layer units (the numbers of input and output units are fixed by the task). In this paper the authors construct multi-layered (4- and 5-layered) NNs trained with BP, consisting of an input layer, two or three hidden layers, and an output layer, and compare them with a 3-layered NN in terms of convergence. The results show that the convergence of a multi-layered NN is much slower than that of a 3-layered NN. However, a multi-layered NN extracts two kinds of meaning from the learning data, such as shape and density.
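As a rough illustration of the comparison described above (not the authors' implementation), the following sketch trains a 3-layered NN (one hidden layer) and a 4-layered NN (two hidden layers) with plain backpropagation on a toy task and reports how many epochs each needs to converge. The layer sizes, learning rate, task, and stopping threshold are all assumptions made for the example.

```python
# Illustrative sketch: compare convergence of a 3-layered vs. a 4-layered
# sigmoid NN trained with backpropagation on XOR. All hyperparameters here
# are assumptions, not values from the paper.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(layer_sizes, X, T, lr=0.5, max_epochs=20000, tol=0.01, seed=0):
    """Train a fully connected sigmoid network with backpropagation.

    Returns the number of epochs needed to reach mean squared error `tol`,
    or max_epochs if the network does not converge within the budget.
    """
    rng = np.random.default_rng(seed)
    # One weight matrix per layer transition; the extra row holds the bias.
    W = [rng.normal(0, 0.5, (n_in + 1, n_out))
         for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
    for epoch in range(1, max_epochs + 1):
        # Forward pass, keeping the activations of every layer.
        acts = [X]
        for w in W:
            a = sigmoid(np.hstack([acts[-1], np.ones((len(X), 1))]) @ w)
            acts.append(a)
        err = acts[-1] - T
        if np.mean(err ** 2) < tol:
            return epoch
        # Backward pass: propagate deltas and update weights layer by layer.
        delta = err * acts[-1] * (1 - acts[-1])
        for i in range(len(W) - 1, -1, -1):
            a_in = np.hstack([acts[i], np.ones((len(X), 1))])
            grad = a_in.T @ delta
            if i > 0:
                delta = (delta @ W[i][:-1].T) * acts[i] * (1 - acts[i])
            W[i] -= lr * grad
    return max_epochs

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
print("3-layered NN epochs:", train([2, 4, 1], X, T))     # one hidden layer
print("4-layered NN epochs:", train([2, 4, 4, 1], X, T))  # two hidden layers
```

In line with the abstract's finding, the deeper network typically needs more epochs (or fails to converge within the budget) when trained with the same plain BP settings, because the error signal weakens as it is propagated through additional sigmoid layers.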
