Abstract

A model describing the dynamics of the synaptic weights of a single neuron performing Hebbian learning is described. The neuron is repeatedly excited by a set of input patterns. Its response is modelled as a continuous, nonlinear function of its excitation. The authors study how the model forms a self-organized representation of the set of input patterns. The dynamical equations are solved directly in a few simple cases. The model is studied for random patterns by a signal-to-noise analysis and by introducing a partition function and applying the replica approach. As the number of patterns is increased, a first-order phase transition occurs where the neuron becomes unable to remember one pattern but learns instead a mixture of very many patterns. The critical number of patterns for this transition scales as N^b, where N is the number of synapses and b is the degree of nonlinearity. The leading-order finite-size corrections are calculated and compared with numerical simulations. It is shown how the representation of the input patterns learned by the neuron depends upon the nonlinearity in the neuron's response. Two types of behaviour can be identified depending on the degree of nonlinearity: either the neuron learns to discriminate one pattern from all the others, or it learns to discriminate a complex mixture of many of the patterns.
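The model summarized above can be illustrated with a minimal sketch, not the paper's exact equations: a single neuron with N synapses is repeatedly excited by random +/-1 patterns, its response is a nonlinear function of the excitation (here a hypothetical power-law g(h) = h^b for h > 0, with b the degree of nonlinearity), and the weights follow a normalized Hebbian update. All parameter names and the specific choice of g are illustrative assumptions.

```python
import math
import random

random.seed(0)
N, p, b, eta, steps = 100, 5, 3, 0.05, 200  # illustrative parameters

# p random +/-1 input patterns, each with N components
patterns = [[random.choice((-1.0, 1.0)) for _ in range(N)] for _ in range(p)]

# random initial synaptic weights of order 1/sqrt(N)
w = [random.gauss(0.0, 1.0 / math.sqrt(N)) for _ in range(N)]

def response(h, b):
    """Continuous, nonlinear response to the excitation h (assumed form)."""
    return max(h, 0.0) ** b

for _ in range(steps):
    xi = random.choice(patterns)  # neuron repeatedly excited by a pattern
    h = sum(wi * xii for wi, xii in zip(w, xi)) / math.sqrt(N)  # excitation
    V = response(h, b)
    # Hebbian update: weight change proportional to response times input
    w = [wi + eta * V * xii for wi, xii in zip(w, xi)]
    # keep the weight vector normalized so the dynamics stay bounded
    norm = math.sqrt(sum(wi * wi for wi in w))
    w = [wi / norm for wi in w]

# the overlap of the weights with each pattern indicates which pattern(s),
# or which mixture of patterns, the neuron's weights have come to represent
overlaps = [sum(wi * xii for wi, xii in zip(w, xi)) / math.sqrt(N)
            for xi in patterns]
```

Varying b in such a sketch loosely mirrors the two regimes described in the abstract: a strong nonlinearity tends to align the weights with a single pattern, while a weak one leaves them spread over a mixture of many patterns.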
