Abstract

The authors consider memory retrieval in a network of M modules. A module consists of N neuronal units, each of which is connected to all N-1 other units within the same module, and to L units distributed randomly throughout all the other modules. Both short- and long-range connections are symmetric. The units are threshold-linear with a continuous positive output. Each module can retrieve one of D local activity patterns, or 'features', stored on the corresponding short-range connections. Furthermore, P global activity patterns, each consisting of combinations of M local features, are stored on the dilute long-range connections. When M>>1 the long-range connections endow the network with attractor states correlated with a single global pattern, and they study its storage capacity within a mean-field approach. If P=D, and each feature appears in only one pattern, their model reduces to an intermediate case between fully connected and highly dilute architectures, whose capacities they recover in the appropriate limits. As P/D takes larger (integer) values, the maximum P grows, but it remains asymptotically proportional to N rather than to L+N-1 (the total number of connections per unit). The maximum amount of retrievable information per synapse, on the other hand, decreases. Moreover, as P/D grows, retrieval attractors have to compete with a 'memory glass' state, involving the retrieval of spurious combinations of features, whose existence and stability they describe analytically. They suggest implications for neocortical memory functions.
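The architecture described above can be sketched in a few lines of code. The following is a minimal toy illustration, not the authors' model: the sizes, sparsity, and the Hebbian covariance learning rule are assumptions (the abstract does not specify how the short- and long-range couplings are learned), and it covers only the P = D case where each feature belongs to exactly one global pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; far smaller than the M >> 1 regime of the paper)
M, N = 4, 50      # M modules of N threshold-linear units each
D, L = 3, 30      # D local features per module; L long-range connections per unit
P = D             # P = D case: each feature appears in exactly one global pattern

# D sparse, positive local activity patterns ('features') per module
features = rng.exponential(1.0, size=(M, D, N)) * (rng.random((M, D, N)) < 0.2)

# Short-range couplings: assumed Hebbian covariance rule within each module
J_short = np.zeros((M, N, N))
for m in range(M):
    d = features[m] - features[m].mean(axis=0)
    J_short[m] = d.T @ d / N
    np.fill_diagonal(J_short[m], 0.0)   # no self-connections

# Global pattern mu = combination of feature mu in every module, flattened
total = M * N
flat = features.transpose(1, 0, 2).reshape(P, total)

# Dilute long-range connectivity: each unit connects to L units in other
# modules; symmetrised so that all connections are symmetric
C = np.zeros((total, total), dtype=bool)
for i in range(total):
    others = [j for j in range(total) if j // N != i // N]
    C[i, rng.choice(others, size=L, replace=False)] = True
C = C | C.T
dflat = flat - flat.mean(axis=0)
J_long = (dflat.T @ dflat / L) * C      # global patterns on long-range synapses

def update(V, gain=0.5, threshold=0.0):
    """One synchronous update of the threshold-linear (continuous,
    positive-output) units, driven by both coupling sets."""
    h = J_long @ V
    for m in range(M):
        h[m * N:(m + 1) * N] += J_short[m] @ V[m * N:(m + 1) * N]
    return np.maximum(0.0, gain * (h - threshold))

# One relaxation step from a noisy version of global pattern 0, and the
# resulting overlaps with all stored global patterns
V = update(flat[0] + 0.1 * rng.random(total))
overlaps = flat @ V / total
```

This sketch only fixes the objects the abstract names (modules, features, global patterns, symmetric dilute connectivity, threshold-linear dynamics); studying retrieval attractors, the storage capacity, or the 'memory glass' state would require the mean-field analysis of the full paper.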
