Abstract

Starting with two hidden units, we train a simple single-hidden-layer feedforward neural network to solve the n-bit parity problem. If the network fails to classify all the input patterns correctly, an additional unit is added to the hidden layer and the network is retrained. This process is repeated until a network that correctly classifies all the input patterns has been constructed. Using a variant of the quasi-Newton methods for training, we have been able to find networks with a single hidden layer containing fewer than n hidden units that solve the n-bit parity problem for some values of n. This demonstrates the power of combining a quasi-Newton method with a node-incremental approach.
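The constructive procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a tanh hidden layer, a sigmoid output, a squared-error loss, and SciPy's BFGS routine standing in for the paper's quasi-Newton variant; all function names and the restart/width limits are hypothetical choices.

```python
import numpy as np
from scipy.optimize import minimize

def parity_data(n):
    """All 2^n binary input patterns and their parity targets."""
    X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)], float)
    y = X.sum(axis=1) % 2  # target is 1 iff the pattern has an odd number of ones
    return X, y

def unpack(w, n, h):
    """Split a flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n * h].reshape(n, h); i += n * h  # input -> hidden
    b1 = w[i:i + h]; i += h                        # hidden biases
    W2 = w[i:i + h]; i += h                        # hidden -> output
    b2 = w[i]                                      # output bias
    return W1, b1, W2, b2

def forward(w, X, h):
    W1, b1, W2, b2 = unpack(w, X.shape[1], h)
    H = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))  # sigmoid output unit

def loss(w, X, y, h):
    return np.mean((forward(w, X, h) - y) ** 2)

def train_incremental(n, max_hidden=8, restarts=5, seed=0):
    """Grow the hidden layer from 2 units until every pattern is classified."""
    rng = np.random.default_rng(seed)
    X, y = parity_data(n)
    for h in range(2, max_hidden + 1):      # start with two hidden units
        for _ in range(restarts):           # retrain from fresh random weights
            w0 = rng.normal(size=n * h + 2 * h + 1)
            res = minimize(loss, w0, args=(X, y, h), method="BFGS")
            pred = (forward(res.x, X, h) > 0.5).astype(int)
            if np.array_equal(pred, y.astype(int)):
                return h, res.x             # all 2^n patterns correct
    return None                             # no network found within the limits

found = train_incremental(3)
```

If training at the current width fails on any of the 2^n patterns, the loop widens the hidden layer by one unit and retrains, mirroring the incremental scheme described in the abstract.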
