Abstract

This paper concerns the robustness of concept formation systems in the presence of concept drift. Concept drift means that the intension of a concept is not stable during the period of learning; stability is a restriction that is otherwise often imposed. The work is based on the architecture of Cobweb, an incremental, probabilistic conceptual clustering system. When incrementally and sequentially exposed to the extensions of a set of concepts, Cobweb retains all examples, disregards the age of a concept, and may create different conceptual structures depending on the order of the examples. These three characteristics make Cobweb sensitive to the effects of concept drift. Six mechanisms that can detect concept drift and adjust the conceptual structure are proposed. A variant of one of these mechanisms, dynamic deletion of old examples, is implemented in a modified Cobweb system called Cobbit. The relative performance of Cobweb and Cobbit in the presence of concept drift is evaluated. In the experiment, the error index, i.e. the average ability to predict each attribute, is used as the main instrument. The experiment is performed in a synthetic domain and indicates that Cobbit regains performance faster after a discrete concept shift.

Keywords: Training Instance, Incremental Learning, Queue Size, Concept Drift, Concept Hierarchy
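To make the dynamic-deletion idea concrete, the following Python sketch wraps a generic incremental clusterer with a bounded FIFO queue of training instances; the oldest example is retracted from the concept hierarchy before a new one is incorporated. This is an illustration of the general mechanism, not of Cobbit's actual implementation: the `IncrementalClusterer`-style interface with `add_instance` and `remove_instance`, and the way the queue size is chosen, are assumptions introduced for the example.

```python
from collections import deque


class SlidingWindowClusterer:
    """Sketch of 'dynamic deletion of old examples': a bounded FIFO queue
    of retained training instances on top of an incremental clusterer.

    `base` is assumed to be any incremental clusterer exposing
    add_instance(instance) and remove_instance(instance) operations
    (hypothetical interface, not Cobweb's actual API).
    """

    def __init__(self, base, queue_size):
        self.base = base              # incremental clusterer being wrapped
        self.queue_size = queue_size  # maximum number of retained examples
        self.window = deque()         # FIFO queue of retained instances

    def learn(self, instance):
        # When the queue is full, forget the oldest example first,
        # so the conceptual structure can track a drifting concept.
        if len(self.window) >= self.queue_size:
            oldest = self.window.popleft()
            self.base.remove_instance(oldest)
        # Incorporate the new example as usual.
        self.base.add_instance(instance)
        self.window.append(instance)
```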

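The abstract describes the error index only as the average ability to predict each attribute. One plausible reading, given purely for illustration, is the fraction of attribute values the learned structure fails to predict when each attribute of a test instance is hidden in turn; the `predict_attribute` method below is hypothetical.

```python
def error_index(clusterer, instances, attributes):
    """Illustrative error index: hide each attribute of each test instance,
    ask the clusterer to predict it, and return the overall error rate.

    Instances are assumed to be dicts mapping attribute names to values;
    predict_attribute(partial_instance, attribute) is a hypothetical method.
    """
    errors = 0
    total = 0
    for inst in instances:
        for attr in attributes:
            hidden = {a: v for a, v in inst.items() if a != attr}
            if clusterer.predict_attribute(hidden, attr) != inst[attr]:
                errors += 1
            total += 1
    return errors / total
```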