Event Abstract

Memory formation, recall and forgetting in neuronal networks

Christian Tetzlaff1*, Christoph Kolodziejski2, Marc Timme2, Misha Tsodyks3 and Florentin Wörgötter1

1 University of Goettingen, Biophysics - Faculty of Physics, Germany
2 Max Planck Institute for Dynamics and Self-Organization, Germany
3 Weizmann Institute, Israel

Which processes underlie the ability of biological neuronal circuits to form, recall, and forget memories (e.g., of objects or facts) is still heavily debated in neuroscience. Several theoretical approaches exist (e.g., Hopfield networks [Hopfield, 1982], attractor networks [Mongillo et al., 2008], liquid-state machines [Maass et al., 2002]), but each has difficulties, for instance arbitrary non-biological constraints (Hopfield networks), predefined connectivity (attractor networks), or supervised recall (liquid-state machines). In Tetzlaff et al., 2011 we analyzed a combination of conventional plasticity [e.g., Hebb, 1949] and synaptic scaling [Turrigiano et al., 1998] that builds up memory traces (cell assemblies) induced by external inputs. Here we show that this mechanism, combined with a cortex-like structure, can form ("learn"), retrieve and delete clusters of highly connected neurons within recurrent neuronal networks. Synapses within such a cluster are significantly stronger than those between different clusters, resembling connectivity patterns recently observed in cortex [Perin et al., 2011]. In our model, intra-cluster connection strength depends directly on the duration and frequency of presentation of an unknown entity. Thus, for complete recall of a well-learned (often presented) entity, a smaller fraction of cluster neurons has to be stimulated and recall is quick. Well-trained clusters differ from sporadically trained clusters not only quantitatively but also qualitatively: the time scale of forgetting is significantly longer for the former than for the latter, so it takes longer to forget a well-learned entity. Synaptic scaling in combination with conventional plasticity thus leads to 1) learning of new memory entities, 2) recall of clusters while synaptic weights remain plastic, which additionally supports memory formation through relearning, and 3) forgetting, which depends on the state (well- or sporadically learned) the cluster is in.
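To make the core mechanism concrete, the following is a minimal, self-contained sketch of combining a Hebbian growth term with a synaptic-scaling term for a single rate-based synapse. It is not the published network model: the linear rate neuron, the quadratic weight dependence of the scaling term, and all parameter values (mu, gamma, v_target) are illustrative assumptions, chosen only to show that scaling can stabilize otherwise unbounded Hebbian growth.

```python
def simulate(gamma, mu=0.01, v_target=0.1, u=1.0, w0=0.5, dt=1.0, steps=5000):
    """Evolve one rate-based synapse under a Hebbian term plus a
    synaptic-scaling term (gamma = 0 switches scaling off).
    All parameter values are illustrative assumptions."""
    w = w0
    for _ in range(steps):
        v = w * u                                        # linear rate neuron
        dw = mu * u * v + gamma * (v_target - v) * w**2  # Hebbian growth + scaling
        w += dt * dw
    return w

print("Hebb only      : w =", simulate(gamma=0.0))               # weight grows without bound
print("Hebb + scaling : w =", round(simulate(gamma=0.001), 3))   # weight saturates at a finite value
```

Running the sketch shows the Hebb-only weight exploding while the combined rule settles at a finite value; this kind of stabilization is what, in the abstract's full recurrent network, allows clusters of strong synapses to form and persist without runaway growth.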