Abstract
The problem we address in this paper is that of finding effective and parsimonious patterns of connectivity in sparse associative memories. Real neuronal systems must solve this same problem, so results from artificial systems may shed light on real ones. We show that efficient patterns of connectivity exist and that these patterns are effective in models with either spiking or non-spiking neurons, suggesting that some general underlying principles may govern good connectivity in such networks. We also show that the clustering of the network, measured by the clustering coefficient, has a strong negative linear correlation with associative-memory performance. This result is important because a purely static measure of network connectivity appears to determine an important dynamic property of the network.
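The clustering coefficient invoked in the abstract is a standard static graph measure: for each node, the fraction of possible edges among its neighbours that actually exist, averaged over all nodes. The sketch below is an illustrative implementation for an undirected 0/1 adjacency matrix, not the paper's own code.

```python
import numpy as np

def clustering_coefficient(A):
    """Mean local clustering coefficient of an undirected graph.

    A: symmetric 0/1 adjacency matrix with zero diagonal.
    For each node, count the edges among its neighbours and divide by
    the maximum possible, k*(k-1)/2; nodes of degree < 2 contribute 0.
    """
    A = np.asarray(A)
    coeffs = []
    for i in range(A.shape[0]):
        nbrs = np.flatnonzero(A[i])        # indices of node i's neighbours
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        # Each neighbour-neighbour edge is counted twice in the symmetric matrix.
        links = A[np.ix_(nbrs, nbrs)].sum() / 2
        coeffs.append(links / (k * (k - 1) / 2))
    return float(np.mean(coeffs))

# A triangle is maximally clustered:
triangle = np.array([[0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])
print(clustering_coefficient(triangle))  # 1.0
```

A 4-cycle, by contrast, has clustering 0: each node's two neighbours are never connected to each other.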
Highlights
Network models for associative memories store the information to be retrieved in the values of the synaptic weights.
The main questions are how a desired memory can be stored by making it an attractor of the network, and how many patterns can be stored and retrieved, within a given error margin, in a network with a given number of neurons and synapses.
For each model investigated, a collection of N artificial neurons is placed on a line of regularly spaced nodes, and the pattern of network connectivity is varied.
Summary
Network models for associative memories store the information to be retrieved in the values of the synaptic weights. Weighted summation of their synaptic inputs allows the neurons to transform any input pattern into an associated output pattern. As in the one-layer perceptron, a static input pattern on the afferent fibers is transformed in one step, by weighted summation, into a static pattern of activity of the neurons in the output layer. The main questions are how a desired memory can be stored by making it an attractor of the network (the learning rule determining the synaptic weights), and how many patterns can be stored and retrieved, within a given error margin, in a network with a given number of neurons and synapses.
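The storage-and-retrieval scheme described above can be sketched with a Hopfield-style network: Hebbian weights make each stored pattern an attractor, and recall is repeated weighted summation followed by a sign threshold. The network size, number of patterns, and noise level below are illustrative assumptions, not values from the paper (which studies sparse, structured connectivity rather than the fully connected case shown here).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                  # number of neurons (illustrative)
P = 5                                    # number of stored +/-1 patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian learning rule: the outer-product sum stores the patterns
# in the synaptic weights, making them attractors of the dynamics.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)                   # no self-connections

def recall(state, steps=10):
    """Iterate weighted summation + sign threshold until (hopefully) a fixed point."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1            # break ties consistently
    return state

# Retrieve a stored pattern from a corrupted cue (10% of bits flipped).
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1
out = recall(cue)
```

At this light load (P/N = 0.05) and mild corruption, the dynamics typically settle back onto the stored pattern; pushing P higher degrades retrieval, which is the capacity question the summary raises.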