Abstract

We study generalization in a fully connected two-layer neural network with multiple output nodes. As in the learning of the fully connected committee machine, the learning is characterized by a discontinuous phase transition between the permutation-symmetric phase and the permutation-symmetry-breaking phase. We find that the learning curve in the permutation-symmetric phase is universal, irrespective of the number of output nodes. The first-order phase-transition point, i.e., the critical number of examples required for perfect learning, is inversely proportional to the number of outputs. The replica calculation shows good agreement with Monte Carlo simulation. © 1996 The American Physical Society.
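
As a minimal sketch of the stated scaling (the notation below is an assumption, not taken from the abstract): writing $K$ for the number of output nodes, $N$ for the input dimension, and $p_c$ for the critical number of examples, the claimed inverse proportionality of the first-order transition point can be written as

% Hedged sketch of the scaling stated in the abstract.
% Assumed notation: K = number of output nodes, N = input dimension,
% p_c = critical number of examples, alpha_c = p_c / N (examples per weight).
\[
  \alpha_c(K) \;\propto\; \frac{1}{K},
  \qquad\text{equivalently}\qquad
  p_c(K) \simeq \frac{p_c(1)}{K}.
\]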
