Abstract

Concept-cognitive learning (CCL) is an emerging field concerned with incremental concept learning and dynamic knowledge processing in dynamic environments. Although CCL has been widely researched in theory, existing studies of CCL share one problem: the concepts obtained by CCL systems lack generalization ability. Meanwhile, existing incremental algorithms still face two challenges: 1) classifiers must adapt gradually and 2) previously acquired knowledge should be utilized efficiently. To address these problems, exploiting the fact that CCL can naturally integrate new data to enhance the flexibility of concept learning, we first propose a new CCL model (CCLM) that extends the classical methods of CCL; CCLM is not only a new classifier but is also well suited to incremental learning. Unlike existing CCL systems, the theory of CCLM is based mainly on a formal decision context rather than a formal context. In learning concepts from dynamic environments, we show that CCLM can naturally incorporate new data with a sufficient theoretical guarantee for incremental learning. For classification and knowledge storage, our results on various data sets demonstrate that CCLM can simultaneously: 1) achieve state-of-the-art performance on static and dynamic classification tasks and 2) directly preserve previously acquired knowledge (or concepts) under dynamic environments.
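
The abstract contrasts a formal decision context with a formal context, the basic structure from formal concept analysis that underlies CCL (a formal decision context, roughly, augments a formal context with decision attributes). As background only, the following minimal Python sketch shows the standard derivation operators and the formal concepts they induce on a toy context; the objects and attributes are hypothetical examples, and this is not the paper's CCLM.

    # Background sketch of formal-concept-analysis machinery (not the paper's CCLM).
    # A formal context (G, M, I): objects G, attributes M, and an incidence
    # relation I, represented here by each object's attribute set (toy data).
    from itertools import combinations

    context = {
        "o1": {"a", "b"},
        "o2": {"a", "c"},
        "o3": {"a", "b", "c"},
    }
    attributes = {"a", "b", "c"}

    def common_attributes(objs):
        """Derivation A -> A': attributes shared by every object in A."""
        objs = set(objs)
        if not objs:
            return set(attributes)
        return set.intersection(*(context[o] for o in objs))

    def common_objects(attrs):
        """Derivation B -> B': objects possessing every attribute in B."""
        attrs = set(attrs)
        return {o for o, owned in context.items() if attrs <= owned}

    def formal_concepts():
        """Enumerate all formal concepts (A, B) with A' = B and B' = A."""
        concepts = set()
        for r in range(len(context) + 1):
            for objs in combinations(context, r):
                intent = frozenset(common_attributes(objs))
                extent = frozenset(common_objects(intent))
                concepts.add((extent, intent))
        return concepts

    for extent, intent in sorted(formal_concepts(), key=lambda c: len(c[0])):
        print(sorted(extent), sorted(intent))

Running the sketch lists every concept of the toy context as an (extent, intent) pair; in broad terms, CCL-style methods maintain and update such concept structures as new objects or attributes arrive over time.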
