Abstract

One of the simplest optimization problems solved by Ising spin models of neural memory is associative memory retrieval. The authors study deterministic convergence properties of the Hopfield synchronous retrieval algorithm for such models. In this case a memory, stored in the network by an appropriate choice of connections, is retrieved by setting the neural outputs to the binary pattern of the recall key (probe) and allowing the network to converge to a stable state. Precise conditions are developed that ensure that all stored memories are fixed points of the retrieval algorithm. An orthogonality-nearness criterion is then obtained for a memory probe itself to be a stationary point and thus outside the error-correcting capability of the memory. A local stability result quantifies the spatial relationship required for fast convergence.
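
The abstract describes the retrieval setting at a high level: memories are stored through a choice of connections, and retrieval starts the network at the probe pattern and iterates a synchronous update until a stable state is reached. The following is a minimal sketch of that setting in NumPy, assuming bipolar {-1, +1} patterns and Hebbian (outer-product) weights with zero diagonal; these storage choices are illustrative assumptions, not the paper's stated conditions.

```python
# Minimal sketch of synchronous Hopfield retrieval. Assumes bipolar
# {-1, +1} patterns and Hebbian outer-product storage (an assumption for
# illustration; the paper's precise conditions are not reproduced here).
import numpy as np

def store(patterns):
    """Build the weight matrix from stored memories via the outer-product
    rule, with zero self-connections."""
    patterns = np.asarray(patterns, dtype=float)   # shape (m, n)
    m, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, probe, max_iters=100):
    """Synchronous retrieval: start from the probe and update all units at
    once until a fixed point (stable state) is reached."""
    x = np.sign(np.asarray(probe, dtype=float))
    for _ in range(max_iters):
        x_next = np.sign(W @ x)
        x_next[x_next == 0] = 1          # break ties consistently
        if np.array_equal(x_next, x):    # fixed point of the update map
            return x, True
        x = x_next
    return x, False                      # may also cycle; not a fixed point

# Example: store two memories and recall one from a corrupted probe.
rng = np.random.default_rng(0)
memories = rng.choice([-1, 1], size=(2, 64))
W = store(memories)
probe = memories[0].copy()
probe[:8] *= -1                          # flip a few bits of the recall key
recalled, converged = retrieve(W, probe)
print(converged, np.array_equal(recalled, memories[0]))
```

In this sketch a stored memory is "retrieved" exactly when the iteration reaches a fixed point equal to it; the paper's results concern when stored memories are such fixed points, when a probe is itself stationary (so no error correction occurs), and how closeness to a memory governs fast convergence.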
