Abstract

Learning in layered neural networks is studied using the methods of statistical mechanics. Networks are trained from examples using the Gibbs algorithm. We focus on the generalization curve, i.e., the average generalization error as a function of the number of examples. We consider perceptron learning with a sigmoid transfer function. Ising perceptrons, with weights constrained to be discrete, exhibit sudden learning at low temperatures within the annealed approximation: there is a first-order transition from a state of poor generalization to a state of perfect generalization. When the transfer function is smooth, the first-order transition occurs only at low temperatures and becomes continuous at high temperatures. When the transfer function is steep, the first-order transition line extends to higher temperatures. The analytic results show good agreement with computer simulations.
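As a rough illustration of the setting described above, the sketch below simulates Gibbs-style Monte Carlo learning of an Ising perceptron (binary weights) from a teacher and estimates the generalization error from the teacher-student overlap. It uses sign outputs rather than the sigmoid transfer function studied in the paper, and the Metropolis sampler, system sizes, and temperature are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's exact procedure):
# learn an Ising perceptron from a teacher by sampling student weights
# from the Gibbs measure exp(-E/T), where E is the training error.

rng = np.random.default_rng(0)

N = 101          # number of weights (assumed; odd so fields are never zero)
P = 300          # number of training examples (assumed)
T = 0.5          # training temperature (assumed)
steps = 20000    # single-spin-flip Monte Carlo attempts (assumed)

teacher = rng.choice([-1, 1], size=N)   # Ising teacher weights
X = rng.choice([-1, 1], size=(P, N))    # random binary inputs
y = np.sign(X @ teacher)                # teacher labels

def training_energy(w):
    """Number of training examples misclassified by student w."""
    return np.sum(np.sign(X @ w) != y)

w = rng.choice([-1, 1], size=N)         # random initial student
E = training_energy(w)

for _ in range(steps):
    i = rng.integers(N)
    w[i] *= -1                          # propose flipping one weight
    E_new = training_energy(w)
    # Metropolis acceptance targets the Gibbs distribution exp(-E / T)
    if E_new <= E or rng.random() < np.exp(-(E_new - E) / T):
        E = E_new
    else:
        w[i] *= -1                      # reject: undo the flip

R = np.dot(w, teacher) / N              # teacher-student overlap
eps_g = np.arccos(R) / np.pi            # generalization error for sign outputs
print(f"training errors: {E}, overlap R = {R:.3f}, eps_g ~ {eps_g:.3f}")
```

The closed form eps_g = arccos(R)/pi applies to sign outputs with random inputs in the large-N limit; for a smooth sigmoid transfer function, as considered in the paper, the generalization error depends on the transfer function and must be computed accordingly.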
