Abstract

Neural networks (NNs) trained with backpropagation (BP) are widely used for recognition and learning. The basic network has three layers: an input layer, one hidden layer, and an output layer, so the scale of a 3-layered NN is determined by the number of hidden-layer units (the numbers of input and output units are fixed by the task). In this paper the authors build multi-layered (4- or 5-layered) BP networks, consisting of an input layer, two or three hidden layers, and an output layer, and compare them with a 3-layered NN in terms of convergence. As a result, the convergence of a multi-layered NN is much slower than that of a 3-layered NN. However, a multi-layered NN extracts two kinds of meaning from the learning data, such as shape and density.
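
The comparison described above can be illustrated with a minimal sketch: a plain backpropagation MLP whose depth is controlled by the list of layer sizes, trained on a toy task until the error falls below a threshold, so that the number of epochs needed by a 3-layered and a 5-layered network can be compared. The abstract does not specify the dataset, activation function, or learning rate, so the XOR task, sigmoid units, MSE loss, and the particular layer widths below are assumptions for illustration only, not the authors' actual experimental setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """Fully-connected net trained with plain backpropagation (sigmoid units, MSE loss)."""

    def __init__(self, layer_sizes, rng):
        # layer_sizes e.g. [2, 4, 1] -> one hidden layer ("3-layered" NN)
        self.W = [rng.uniform(-0.5, 0.5, (n_in, n_out))
                  for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.b = [np.zeros(n_out) for n_out in layer_sizes[1:]]

    def forward(self, x):
        acts = [x]
        for W, b in zip(self.W, self.b):
            acts.append(sigmoid(acts[-1] @ W + b))
        return acts

    def train_step(self, x, t, lr=0.5):
        acts = self.forward(x)
        # output-layer delta for MSE loss with sigmoid units
        delta = (acts[-1] - t) * acts[-1] * (1.0 - acts[-1])
        for i in reversed(range(len(self.W))):
            grad_W = acts[i].T @ delta
            grad_b = delta.sum(axis=0)
            if i > 0:
                # propagate the error through the (still unmodified) weights
                delta = (delta @ self.W[i].T) * acts[i] * (1.0 - acts[i])
            self.W[i] -= lr * grad_W
            self.b[i] -= lr * grad_b
        return float(np.mean((acts[-1] - t) ** 2))

# Toy comparison on XOR (hypothetical task): epochs needed to reach a small MSE.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

for name, sizes in [("3-layered (1 hidden layer)", [2, 4, 1]),
                    ("5-layered (3 hidden layers)", [2, 4, 4, 4, 1])]:
    net = MLP(sizes, np.random.default_rng(0))
    for epoch in range(1, 20001):
        mse = net.train_step(X, T)
        if mse < 1e-3:
            break
    print(f"{name}: stopped at epoch {epoch} with MSE {mse:.4f}")
```

Under this kind of setup the deeper sigmoid network typically needs many more epochs (or fails to reach the threshold within the budget), which is consistent with the slower convergence the abstract reports for multi-layered networks.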
