Abstract
The CA3 region of the hippocampus is a recurrent neural network that is essential for the storage and replay of sequences of patterns that represent behavioral events. Here we present a theoretical framework to calculate a sparsely connected network's capacity to store such sequences. As in CA3, only a limited subset of neurons in the network is active at any one time, pattern retrieval is subject to error, and the resources for plasticity are limited. Our analysis combines an analytical mean-field approach, stochastic dynamics, and cellular simulations of a time-discrete McCulloch-Pitts network with binary synapses. To maximize the number of sequences that can be stored in the network, we concurrently optimize the number of active neurons, that is, the pattern size, and the firing threshold. We find that for one-step associations (i.e., minimal sequences), the optimal pattern size is inversely proportional to the mean connectivity c, whereas the optimal firing threshold is independent of the connectivity. If the number of synapses per neuron is fixed, the maximum number P of stored sequences in a sufficiently large, nonmodular network is independent of its number N of cells. On the other hand, if the number of synapses scales as the network size to the power of 3/2, the number of sequences P is proportional to N. In other words, sequential memory is scalable. Furthermore, we find that there is an optimal ratio r between silent and nonsilent synapses at which the storage capacity alpha = P/[c(1 + r)N] assumes a maximum. For long sequences, the capacity of sequential memory is about one order of magnitude below the capacity for minimal sequences, but otherwise behaves similarly to the case of minimal sequences. In a biologically inspired scenario, the information content per synapse is far below theoretical optimality, suggesting that the brain trades off error tolerance against information content in encoding sequential memories.
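The model class described above can be illustrated with a minimal sketch of a time-discrete McCulloch-Pitts network with binary, clipped-Hebbian synapses storing one-step associations. The parameter values below (network size N, connectivity c, pattern size M, threshold theta, number of stored pairs P) are illustrative choices, not the values or the optimization procedure used in the paper, and the ratio r of silent to nonsilent synapses is not modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not taken from the paper)
N = 2000      # number of neurons
c = 0.1       # mean connectivity: probability that a synapse j -> i exists
M = 100       # pattern size: number of active neurons per pattern
P = 50        # number of stored one-step associations (minimal sequences)
theta = 5     # firing threshold: active inputs needed for a neuron to fire

def random_pattern():
    """Binary pattern with exactly M active neurons out of N."""
    x = np.zeros(N, dtype=np.uint8)
    x[rng.choice(N, size=M, replace=False)] = 1
    return x

# Sparse structural connectivity: C[i, j] = 1 if a synapse from j to i exists
C = (rng.random((N, N)) < c).astype(np.uint8)

# Binary synaptic weights: a synapse j -> i is potentiated (set to 1) whenever
# j is active in the source pattern and i is active in the target pattern
pairs = [(random_pattern(), random_pattern()) for _ in range(P)]
W = np.zeros((N, N), dtype=np.uint8)
for src, tgt in pairs:
    W |= np.outer(tgt, src).astype(np.uint8)
W &= C  # only anatomically existing synapses can carry a potentiated weight

def step(x):
    """One discrete McCulloch-Pitts update: fire if summed input reaches theta."""
    h = W.astype(int) @ x.astype(int)
    return (h >= theta).astype(np.uint8)

# Retrieval quality: fraction of the stored target pattern recovered in one step
def overlap(recalled, target):
    return (recalled & target).sum() / M

quality = np.mean([overlap(step(src), tgt) for src, tgt in pairs])
activity = np.mean([step(src).sum() for src, _ in pairs])
print(f"mean overlap with target: {quality:.2f}, mean recalled activity: {activity:.1f}")
```

With these settings, a target neuron receives on average c*M correct inputs, while crosstalk from other stored pairs stays well below the threshold, so one update step recalls most of each stored target with only a few spurious active neurons; raising P degrades retrieval, which is the trade-off the capacity analysis quantifies.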