Abstract

In the context of attractor neural networks (ANNs), we are concerned with the following issue: assuming the interaction (synaptic) matrix to be the simple Hebb-Hopfield correlation matrix, we discuss how the storage performance of an ANN may depend on the equilibrium analogue neural activities reached by the dynamics during memory retrieval. In both discrete and analogue Hopfield-like attractor neural networks, the phase transition of the system from associative memory to spin glass is due to temporal correlations arising from the static noise produced by the interference between the retrieved pattern and the other stored memories. The introduction of a suitable cost function in the space of neural activities allows us to study how such static noise may be reduced, and to derive a class of simple response functions for which the dynamics stabilizes the 'ground-state' neural activities, i.e. those that minimize the cost function, up to a number of stored patterns equal to α*N (N = number of neurons), where α* ∈ [0, 0.41] depends on the average activity in the network.
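As an illustrative sketch only, the Hebb-Hopfield correlation matrix referred to above, J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, and a standard zero-temperature discrete retrieval dynamics can be written as follows. The pattern statistics, network size, and noise level are hypothetical choices, not taken from the paper, and the cost function and response functions the abstract refers to are not reproduced here:

```python
import numpy as np

# Hypothetical parameters: N neurons, p random +/-1 patterns,
# so the storage load is alpha = p / N = 0.05.
rng = np.random.default_rng(0)
N, p = 200, 10
xi = rng.choice([-1, 1], size=(p, N))

# Hebb-Hopfield correlation matrix J_ij = (1/N) sum_mu xi_i^mu xi_j^mu,
# with the self-interaction removed.
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def retrieve(s, steps=20):
    """Iterate the zero-temperature discrete dynamics s <- sign(J s)."""
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1  # break ties deterministically
    return s

# Start from a noisy copy of pattern 0 (10% of neurons flipped) and
# measure the retrieval overlap m = (1/N) sum_i xi_i^0 s_i.
s0 = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s0[flip] *= -1
m = (xi[0] @ retrieve(s0)) / N
print(m)
```

At this load, well below the Hopfield capacity α ≈ 0.138, the dynamics should recover the stored pattern and the overlap m should be close to 1; the static noise from the other p − 1 patterns is the interference term the abstract discusses.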
