Abstract

Continual learning remains an unsolved problem in artificial neural networks. The brain has evolved mechanisms to prevent catastrophic forgetting of old knowledge during new training. Building upon data suggesting the importance of sleep in learning and memory, we tested the hypothesis that sleep protects old memories from being forgotten after new learning. In a thalamocortical model, training a new memory interfered with previously learned old memories, leading to degradation and forgetting of the old memory traces. Simulating sleep after new learning reversed the damage and enhanced both old and new memories. We found that when a new memory competed for previously allocated neuronal/synaptic resources, sleep replay changed the synaptic footprint of the old memory to allow overlapping neuronal populations to store multiple memories. Our study predicts that memory storage is dynamic, and that sleep enables continual learning by combining consolidation of new memory traces with reconsolidation of old memory traces to minimize interference.

Highlights

  • Animals and humans are capable of continuous, sequential learning

  • Our results suggest that sleep provides a powerful mechanism to achieve continual learning by combining consolidation of new memory traces with reconsolidation of old memory traces to minimize memory interference

  • We found that sleep not only reversed the damage caused to sequence 1 (S1) following S1* training, but also enhanced all previously trained memory sequences: S1, sequence 2 (S2), and S1*

Introduction

Animals and humans are capable of continuous, sequential learning. In contrast, modern artificial neural networks suffer from the inability to perform continual learning (French, 1999; Hassabis et al., 2017; Hasselmo, 2017; Kirkpatrick et al., 2017; Ratcliff, 1990). Several attempts have been made to overcome this problem, including (1) explicit retraining of all previously learned memories – interleaved training (Hasselmo, 2017), (2) using generative models to reactivate previous inputs (Kemker and Kanan, 2017), and (3) artificially “freezing” subsets of synapses critical for the old memories (Kirkpatrick et al., 2017). Although these solutions help prevent new memories from interfering with previously stored old memories, they either require explicit retraining of all old memories using the original data or impose limitations on the types of trainable new memories and network architectures (Kemker et al., 2017). We propose a mechanism for how sleep modifies the synaptic connectivity matrix to minimize interference between competing memory traces, enabling continual learning.
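To make the interference problem concrete, the following is a minimal sketch (in NumPy, not the paper's spiking thalamocortical model) of catastrophic forgetting under purely sequential training and of the interleaved-training baseline listed as approach (1). The toy tasks, network size, and learning parameters are hypothetical choices made only for illustration; two input permutations of the same nonlinear rule stand in for two memory sequences competing for the same shared weights.

```python
# Minimal sketch: catastrophic forgetting vs. interleaved (rehearsal) training.
# Hypothetical toy setup; not the thalamocortical model used in the paper.
import numpy as np

rng = np.random.default_rng(0)
N_FEATURES, N_HIDDEN = 16, 64

def make_task(permutation, n_samples=400):
    """Toy binary task: a fixed nonlinear rule applied to permuted inputs.
    Two different permutations stand in for two competing memory sequences."""
    X = rng.normal(size=(n_samples, N_FEATURES))
    y = (np.sin(X[:, 0]) + X[:, 1] * X[:, 2] > 0).astype(float)[:, None]
    return X[:, permutation], y

def init_params():
    return {
        "W1": rng.normal(scale=0.3, size=(N_FEATURES, N_HIDDEN)),
        "b1": np.zeros(N_HIDDEN),
        "W2": rng.normal(scale=0.3, size=(N_HIDDEN, 1)),
        "b2": np.zeros(1),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])                  # shared hidden layer
    z = np.clip(h @ p["W2"] + p["b2"], -30.0, 30.0)     # clip to avoid overflow
    return h, 1.0 / (1.0 + np.exp(-z))

def train(p, X, y, epochs=2000, lr=1.0):
    """Full-batch gradient descent on the cross-entropy loss."""
    for _ in range(epochs):
        h, out = forward(p, X)
        d_out = (out - y) / len(y)                      # dL/d(pre-sigmoid)
        d_h = (d_out @ p["W2"].T) * (1.0 - h ** 2)      # backprop through tanh
        p["W2"] -= lr * h.T @ d_out
        p["b2"] -= lr * d_out.sum(axis=0)
        p["W1"] -= lr * X.T @ d_h
        p["b1"] -= lr * d_h.sum(axis=0)
    return p

def accuracy(p, X, y):
    return float(np.mean((forward(p, X)[1] > 0.5) == y))

perm_old = rng.permutation(N_FEATURES)   # "old memory" (e.g. S1)
perm_new = rng.permutation(N_FEATURES)   # "new memory" (e.g. S1*), same weights
task_old, task_new = make_task(perm_old), make_task(perm_new)

# Sequential training: old task first, then the new task alone -> interference.
p_seq = train(init_params(), *task_old)
acc_old_initial = accuracy(p_seq, *task_old)
p_seq = train(p_seq, *task_new)

# Interleaved training (approach 1): new task mixed with rehearsal of old data.
p_int = train(init_params(), *task_old)
X_mix = np.vstack([task_old[0], task_new[0]])
y_mix = np.vstack([task_old[1], task_new[1]])
p_int = train(p_int, X_mix, y_mix)

print(f"old task, after initial training        : {acc_old_initial:.2f}")
print(f"old task, after sequential new training : {accuracy(p_seq, *task_old):.2f}")
print(f"old task, after interleaved training    : {accuracy(p_int, *task_old):.2f}")
print(f"new task, after interleaved training    : {accuracy(p_int, *task_new):.2f}")
```

In this toy setting, old-task accuracy typically drops after sequential training on the new task, whereas interleaved training largely retains it – but only by explicitly revisiting the original data, which is the limitation of such approaches noted above and the motivation for the sleep-based mechanism proposed here.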
