Abstract

The authors consider the retrieval properties of attractor neural networks whose synaptic matrices have been constructed to maximise the number of patterns that can be stored in a perceptron satisfying certain constraints. Retrieval is studied both in the absence and in the presence of fast noise (temperature). The discussion is restricted to dilute networks, for which dynamical equations for the overlaps are available. When the patterns are stored with a prescribed lower limit on the stability parameter at every site, the full stability of the perceptron ensures the existence of an attractor with perfect retrieval. The curve of critical storage capacity (α) against temperature (T) is found to be a line of first-order transitions for high values of α, becoming second order for low α at a tricritical point. The phase diagram is compared with that of the dilute Hopfield model; at high synaptic noise levels the dilute Hopfield net stores more effectively than the network trained for optimal perceptron storage. When a given fraction of sites is allowed to violate the stability bound, the solution of the perceptron 'learning' problem no longer ensures the existence of an attractor with finite overlap, even in the absence of noise. This case is studied separately for T=0 and for finite T.
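To make the stability-bound construction concrete, here is a minimal sketch in Python; it is an illustration under stated assumptions, not the paper's construction. Each site's incoming couplings are trained by a perceptron rule with margin (in the spirit of MinOver-type learning with optimal stability), enforcing the stability parameter Δᵢ^μ = ξᵢ^μ (Jᵢ·ξ^μ)/|Jᵢ| ≥ κ for every stored pattern, and zero-temperature parallel dynamics are then run from a corrupted pattern to track the overlap. All sizes and parameters (N, C, P, KAPPA) are illustrative choices, not values from the paper.

```python
import numpy as np

# Hedged sketch: diluted network in which each site i receives C << N randomly
# chosen inputs, and its incoming couplings are trained so that every stored
# pattern satisfies the stability bound
#   Delta_i^mu = xi_i^mu (J_i . xi^mu) / |J_i| >= KAPPA.
# Parameters below are illustrative, not taken from the paper.

rng = np.random.default_rng(0)

N = 300        # number of sites
C = 40         # inputs per site (dilution; the paper works in the dilute limit)
P = 4          # stored patterns, so alpha = P / C = 0.1
KAPPA = 1.5    # prescribed lower limit on the stability parameter

xi = rng.choice([-1.0, 1.0], size=(P, N))   # random binary patterns

def train_site(i, max_steps=50_000):
    """Perceptron-with-margin rule for the incoming couplings of site i."""
    inputs = rng.choice(np.delete(np.arange(N), i), size=C, replace=False)
    X = xi[:, inputs]          # P x C: inputs seen by site i on each pattern
    y = xi[:, i]               # desired output of site i on each pattern
    J = np.zeros(C)
    for _ in range(max_steps):
        norm = np.linalg.norm(J) + 1e-12
        delta = y * (X @ J) / norm          # stability of each pattern
        bad = np.flatnonzero(delta < KAPPA)
        if bad.size == 0:                   # all stabilities >= KAPPA: done
            break
        mu = bad[np.argmin(delta[bad])]     # update on the worst pattern
        J += y[mu] * X[mu] / np.sqrt(C)
    return inputs, J

sites = [train_site(i) for i in range(N)]

# Zero-temperature parallel dynamics from a corrupted copy of pattern 0;
# the overlap m(t) = (1/N) sum_i xi_i s_i(t) measures retrieval quality.
s = xi[0].copy()
s[rng.random(N) < 0.15] *= -1               # flip 15% of the sites
for t in range(8):
    m = float(s @ xi[0]) / N
    print(f"t={t}  overlap m={m:.3f}")
    h = np.array([J @ s[inputs] for inputs, J in sites])
    s = np.where(h >= 0.0, 1.0, -1.0)
```

Because every local stability is at least KAPPA, the stored pattern is a fixed point of the noiseless dynamics, which is the perfect-retrieval attractor referred to above; if a fraction of sites were allowed to violate the bound, that guarantee would be lost, which is the second case the abstract describes.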
