Abstract

Using the statistical mechanics formulation of Amit, Gutfreund and Sompolinsky (1985), the authors investigate the retrieval properties of a neural-network model that exhibits the same organization into clusters as Dyson's hierarchical model of ferromagnetism, combined with Hebb's learning algorithm for a non-extensive number of stored patterns. They show that if the number of clusters l ≤ 4, the model can perfectly retrieve a family of 'descendants' together with a 'pure' embedded pattern; these appear as local minima of the free energy through a series of discontinuous transitions as the temperature (noise) is reduced. The highest critical temperature is that for retrieval of the embedded patterns, which occurs through a second-order, continuous transition, and the embedded patterns remain the global minima of the free energy. The 'descendants' differ from the 'ancestor' in the signs of the cluster overlaps. However, as the number of clusters increases, 'blurred' solutions appear, consisting of arbitrary mixtures of 'descendants' of a given pattern, which may hinder perfect retrieval. The number n(l) of mixed solutions grows exponentially with the number of clusters, n(l) ≈ exp(0.45 l), for large l.
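To make the cluster structure concrete, here is a minimal illustrative sketch, not the authors' full hierarchical model: a single pattern stored with the Hebb rule, a partition of the N neurons into l clusters, and a 'descendant' obtained by flipping the signs of whole clusters. The cluster count, cluster size, the particular sign choice, and all variable names are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

l = 4          # number of clusters (hypothetical small example)
s = 8          # neurons per cluster (assumed)
N = l * s      # total number of neurons

# One stored pattern of +/-1 spins; with a single pattern the number of
# stored patterns is trivially non-extensive.
xi = rng.choice([-1, 1], size=N)

# Hebb couplings J_ij = xi_i * xi_j / N, with zero self-coupling.
J = np.outer(xi, xi) / N
np.fill_diagonal(J, 0.0)

def cluster_overlaps(state):
    """Overlap of a network state with the stored pattern, per cluster."""
    return np.array([state[k*s:(k+1)*s] @ xi[k*s:(k+1)*s] / s
                     for k in range(l)])

# A 'descendant' of the 'ancestor' xi flips the sign of entire clusters,
# so it differs from xi only in the signs of its cluster overlaps.
flips = np.array([1, -1, 1, -1])       # one arbitrary sign choice
descendant = xi * np.repeat(flips, s)

print(cluster_overlaps(xi))            # all cluster overlaps equal +1
print(cluster_overlaps(descendant))    # overlaps carry the chosen signs
```

In the plain Hopfield/Hebb model such sign-flipped states are not special; it is the hierarchical (Dyson-like) organization of the couplings in the paper that makes the descendants retrievable minima of the free energy.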
