Abstract

The hippocampal formation plays a crucial role in organizing cortical long-term memory. It is believed that the hippocampus is capable of fast (one-shot) learning of new episodic information, followed by extensive time periods during which the corresponding neocortical representations are trained and compressed [1]. Here, compression usually refers to processes such as chunking spatially and temporally distributed activity patterns. We take the complementary approach and optimize the synaptic network by structural plasticity, e.g., replacing unused synapses, thereby making full use of the potential connectivity [2]. We apply the frameworks of structural plasticity and hippocampus-induced learning to the training of neocortical associative networks [3]. Associative networks such as the Hopfield or Willshaw model are at the heart of many cortex theories and have long been analyzed with respect to information storage capacity and plausible retrieval strategies [3,4]. For example, it is well known that a completely connected network can store about 0.7 bits per synapse. However, for incompletely connected networks, the capacity per synapse can be massively reduced or even vanish, depending on the retrieval algorithm [4]. In this work we analyze how structural processes and synaptic consolidation [5] during hippocampal training can improve the performance of neocortical associative networks by emulating full (or increased) synaptic connectivity. In our model the hippocampus can store a set of activity patterns by one-shot learning. The hippocampus then trains the neocortex by repeatedly replaying the patterns in a sequence. The synapses of the neocortical network are consolidated depending on Hebbian learning. In each time step, a fraction of the unconsolidated synapses is removed and replaced by the same number of new synapses at random locations, thereby maintaining total connectivity. We show that this procedure can massively increase the synaptic capacity of a cortical macrocolumn (by a factor of 10–20, or even up to a factor of 200 for pattern capacity). In a second step we analyze the model with respect to the time (or number of repetitions) necessary to increase the effective connectivity from its base level to a desired level. The analysis shows that achieving acceptable training times requires keeping a certain fraction of synapses unconsolidated so that the network remains plastic.
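The replay-and-consolidation loop described above lends itself to a compact simulation. The following is a minimal sketch, not the authors' implementation: a Willshaw-type clipped Hebbian matrix stands in for the neocortical target network, and all parameter values (network size n, pattern activity k, pattern count M, base connectivity P, replacement fraction r, number of replay epochs) are illustrative assumptions.

# Minimal sketch (not the authors' code) of hippocampal replay with
# synaptic consolidation and structural turnover in a Willshaw-type
# associative network. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 1000        # neurons in the neocortical target network (assumed size)
k = 20          # active units per sparse pattern
M = 100         # patterns stored one-shot in the hippocampus and replayed
P = 0.1         # base anatomical connectivity of the neocortical network
r = 0.2         # fraction of unconsolidated synapses replaced per replay epoch
epochs = 200    # number of replay repetitions

# Random k-out-of-n binary patterns stand in for hippocampal replays.
patterns = np.zeros((M, n), dtype=bool)
for mu in range(M):
    patterns[mu, rng.choice(n, size=k, replace=False)] = True

# Willshaw-style clipped Hebbian matrix over ALL potential synapses:
# H[i, j] = True if units i and j are coactive in at least one pattern.
H = np.zeros((n, n), dtype=bool)
for mu in range(M):
    idx = np.flatnonzero(patterns[mu])
    H[np.ix_(idx, idx)] = True

# Actual structural connectivity: a random subset of potential synapses.
C = rng.random((n, n)) < P
consolidated = np.zeros((n, n), dtype=bool)

for epoch in range(epochs):
    # Replay consolidates every existing synapse that receives coincident
    # pre-/postsynaptic activity, i.e., that Hebbian learning would use.
    consolidated |= C & H
    # Remove a fraction r of the unconsolidated synapses ...
    weak = np.argwhere(C & ~consolidated)
    n_replace = int(r * len(weak))
    if n_replace == 0:
        break
    drop = weak[rng.choice(len(weak), size=n_replace, replace=False)]
    C[drop[:, 0], drop[:, 1]] = False
    # ... and grow the same number at random empty sites, so that total
    # connectivity stays at its base level P.
    empty = np.argwhere(~C)
    grow = empty[rng.choice(len(empty), size=n_replace, replace=False)]
    C[grow[:, 0], grow[:, 1]] = True

# Effective connectivity: fraction of Hebbian-required synapses realized.
print(f"effective connectivity: {(C & H).sum() / H.sum():.2f} (base level {P})")

Under these assumptions, the fraction of pattern-supporting synapses that are actually realized (the effective connectivity) climbs from the base level P toward one over the replay epochs. Shrinking the replacement fraction r slows this convergence, which illustrates the trade-off between training time and plasticity noted at the end of the abstract.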

Author information

Address: 1 Honda Research Institute Europe, D-63073 Offenbach, Germany and 2 Redwood Center for Theoretical Neuroscience, University of California, Berkeley, CA, USA

Email: Andreas Knoblauch* - andreas.knoblauch@honda-ri.de

* Corresponding author

From Sixteenth Annual Computational Neuroscience Meeting: CNS*2007, Toronto, Canada. A single PDF containing all abstracts in this Supplement is available at http://www.biomedcentral.com/content/pdf/1471-2202-8-S2-info.pdf
