Abstract

Model-based learning systems such as neural networks tend to “forget” learned skills when they incrementally learn new instances, because modifying a parameter to encode a new instance interferes with old memories. To avoid forgetting, incremental learning in these systems must therefore include relearning of old instances; this relearning process, however, is time-consuming. We present two types of incremental learning methods designed to achieve quick adaptation with low resource requirements. One approach uses a sleep phase to provide time for relearning. The other employs a “meta-learning module” that acquires learning skills through experience: the system carries out “reactive modification” of parameters not only to memorize new instances, but also, guided by the meta-learning module, to avoid forgetting old memories.
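The trade-off the abstract describes, where relearning old instances prevents forgetting but costs time, can be illustrated with a deliberately tiny toy model (an assumption for illustration, not the paper's actual system): a one-parameter model trained by gradient descent loses its first task when trained on a conflicting second task, unless the old instances are mixed back into the relearning pass.

```python
# Toy sketch (hypothetical; not the paper's system) of catastrophic
# forgetting and rehearsal in a one-parameter model y = w * x.

def fit(w, data, lr=0.05, steps=500):
    """Full-batch gradient descent on mean squared error."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def mse(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

old_task = [(1.0, 2.0), (2.0, 4.0)]    # targets consistent with w = 2
new_task = [(1.0, 5.0), (2.0, 10.0)]   # targets consistent with w = 5 (conflicts)

w = fit(0.0, old_task)                    # learn the old task: w converges near 2
w_naive = fit(w, new_task)                # new instances only: old task is forgotten
w_rehearse = fit(w, old_task + new_task)  # relearn old instances too: a compromise

# Error on the old task is much lower when old instances were relearned.
print(mse(w_naive, old_task), mse(w_rehearse, old_task))
```

The extra pass over `old_task` in the rehearsal case is exactly the time-consuming relearning step the abstract refers to; the paper's sleep phase and meta-learning module are presented as ways to make that cost affordable.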
