Abstract
We study generalization in a large fully connected committee machine with continuous weights, trained on patterns whose outputs are generated by a teacher of the same structure but corrupted by noise. The corruption is due to additive Gaussian noise applied in the input layer or the hidden layer of the teacher. In contrast to related cases, in the presence of input noise the generalization error $\epsilon_g$ is not minimized by the teacher's weights. For small values of the load parameter $\alpha$ the student is in a permutation-symmetric phase; as $\alpha$ increases, three additional phases emerge. The large-$\alpha$ theory of the stable phase is similar to that of the tree committee machine. In particular, at zero temperature in the presence of noise $\epsilon_g$ does not approach its minimal value $\epsilon_{\min}$, and the student's weights do not converge to those of the teacher. At positive temperature $\epsilon_g - \epsilon_{\min}$ decays as a power of $\alpha$, with the same exponent as in the corresponding case of the tree. However, for all values of $\alpha$ there exists an at least metastable phase that is permutation symmetric with respect to the teacher.
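To make the setup concrete, the following is a minimal sketch (not from the paper) of the teacher-student scenario described above: a fully connected committee machine whose teacher outputs are corrupted by additive Gaussian noise in either the input layer or the hidden layer, together with a Monte Carlo estimate of $\epsilon_g$. All function names and the noise parameters `sigma_in` and `sigma_hid` are illustrative assumptions.

```python
import numpy as np

def committee_output(W, x):
    """Fully connected committee machine: sign of the sum of the
    sign-activations of the hidden units. W has shape (K, N); K is
    assumed odd so the hidden-layer vote cannot tie."""
    return np.sign(np.sign(W @ x).sum())

def noisy_teacher_output(W, x, sigma_in=0.0, sigma_hid=0.0, rng=None):
    """Teacher output corrupted by additive Gaussian noise, applied
    either to the inputs (sigma_in) or to the hidden-layer local
    fields (sigma_hid), as in the two cases discussed in the abstract."""
    rng = rng or np.random.default_rng()
    x_noisy = x + sigma_in * rng.standard_normal(x.shape)
    fields = W @ x_noisy / np.sqrt(x.size)      # hidden local fields
    fields += sigma_hid * rng.standard_normal(fields.shape)
    return np.sign(np.sign(fields).sum())

def generalization_error(W_student, W_teacher, n_test=10_000, rng=None):
    """Monte Carlo estimate of eps_g: the probability that the student
    disagrees with the noise-free teacher on a fresh random input."""
    rng = rng or np.random.default_rng()
    N = W_teacher.shape[1]
    errors = 0
    for _ in range(n_test):
        x = rng.standard_normal(N)
        errors += committee_output(W_student, x) != committee_output(W_teacher, x)
    return errors / n_test
```

In this picture the training set consists of $P = \alpha N$ pairs $(x^\mu, \texttt{noisy\_teacher\_output}(W_{\mathrm{teacher}}, x^\mu))$; the abstract's claim is that with input noise the student minimizing the training error need not converge to $W_{\mathrm{teacher}}$, i.e. `generalization_error` is not minimized at the teacher's weights.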