Abstract

A general mean-field theory is presented for an attractor neural network in which each elementary unit is described by one input and one output real variable, and whose synaptic strengths are determined by a covariance imprinting rule. In the case of threshold-linear units, a single equation is shown to yield the storage capacity for the retrieval of random activity patterns drawn from any given probability distribution. If this distribution produces binary patterns, the storage capacity is essentially the same as for networks of binary units. To explore the effects of storing more structured patterns, the case of a ternary distribution is studied. It is shown that the number of patterns that can be stored can be much higher than in the binary case, whereas the total amount of retrievable information does not exceed the limit obtained with binary patterns.
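The covariance imprinting rule and threshold-linear retrieval dynamics summarized above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's model: the network size, number of patterns, sparseness value, fixed zero threshold, unit gain, and the use of binary patterns are all assumptions chosen to make the sketch run, and the paper's mean-field analysis (thresholds and gains tuned to fix the activity distribution) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400   # number of units (assumed, for illustration)
P = 5     # stored patterns, well below capacity
a = 0.2   # mean activity (sparseness) of the patterns

# Binary activity patterns eta^mu_i in {0, 1} with mean activity a
patterns = (rng.random((P, N)) < a).astype(float)

# Covariance imprinting rule:
#   J_ij = (1 / (a(1-a)N)) * sum_mu (eta^mu_i - a)(eta^mu_j - a),  J_ii = 0
centered = patterns - a
J = centered.T @ centered / (a * (1.0 - a) * N)
np.fill_diagonal(J, 0.0)

def threshold_linear(h, thr=0.0, gain=1.0):
    """Threshold-linear transfer: output = gain * max(h - thr, 0)."""
    return gain * np.maximum(h - thr, 0.0)

def retrieve(cue, steps=20):
    """Iterate the recurrent dynamics from a cue; return final activity."""
    V = cue.copy()
    for _ in range(steps):
        V = threshold_linear(J @ V)
    return V

# Cue the network with a corrupted copy of pattern 0 (15% of units flipped)
flip = rng.random(N) < 0.15
cue = patterns[0].copy()
cue[flip] = 1.0 - cue[flip]
V = retrieve(cue)

# Retrieval quality: correlation of the final state with the cued pattern
# should be high, and much larger than with any other stored pattern.
corr0 = float(np.corrcoef(V, patterns[0])[0, 1])
corr1 = float(np.corrcoef(V, patterns[1])[0, 1])
```

With these toy parameters the dynamics clean up the corrupted cue, so `corr0` ends up close to 1 while `corr1` stays near zero; the overall activity level decays geometrically because no gain or threshold regulation is applied, which is one of the things the full mean-field treatment handles self-consistently.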
