Abstract

Recently, several classification algorithms capable of dealing with potentially infinite data streams have been proposed. One of the main challenges of this task is to continuously update predictive models to address concept drifts without compromising their predictive performance. Moreover, the classification algorithm must operate under processing time and memory limitations. In the data stream mining literature, ensemble-based classification algorithms are a good alternative for satisfying these requirements. These algorithms combine multiple weak learners, e.g., the Very Fast Decision Tree (VFDT), to create a model with higher predictive performance. However, the memory cost of each weak learner accumulates in an ensemble, jeopardizing the limited-memory requirement. To manage the trade-off between accuracy, memory space, and processing time, this paper proposes the Strict VFDT (SVFDT) algorithm, which reduces memory consumption without harming predictive performance, as an alternative weak learner for ensemble solutions. This paper experimentally compares two traditional and three state-of-the-art ensembles using the VFDT and the SVFDT as weak learners across thirteen benchmark datasets. According to the experimental results, the proposed algorithm can achieve similar predictive performance with significant savings in memory space.
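To make the ensemble-of-weak-learners idea concrete, the following is a minimal, self-contained sketch of Oza-style online bagging, a common way to combine incremental weak learners on a stream: each arriving instance is shown to each base learner k ~ Poisson(1) times, approximating bootstrap sampling. The `MajorityClassLearner` is a hypothetical stand-in for a VFDT/SVFDT tree (which this sketch does not implement), and all class and method names are illustrative, not from the paper.

```python
import math
import random
from collections import Counter


class MajorityClassLearner:
    """A trivially weak online learner: predicts the most frequent class seen.
    (Stand-in for a VFDT/SVFDT tree; a real ensemble would use such trees.)"""

    def __init__(self):
        self.counts = Counter()

    def learn_one(self, x, y):
        self.counts[y] += 1

    def predict_one(self, x):
        return self.counts.most_common(1)[0][0] if self.counts else None


class OnlineBagging:
    """Oza-style online bagging: each instance is presented to each base
    learner k ~ Poisson(1) times, emulating bootstrap resampling on a stream."""

    def __init__(self, base_cls, n_models=10, seed=42):
        self.models = [base_cls() for _ in range(n_models)]
        self.rng = random.Random(seed)

    def _poisson1(self):
        # Knuth's algorithm for sampling Poisson(lambda=1)
        limit, k, p = math.exp(-1.0), 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= limit:
                return k
            k += 1

    def learn_one(self, x, y):
        for model in self.models:
            for _ in range(self._poisson1()):
                model.learn_one(x, y)

    def predict_one(self, x):
        # Majority vote over the base learners' predictions
        votes = Counter(m.predict_one(x) for m in self.models)
        return votes.most_common(1)[0][0]


# Usage: process a labeled stream one instance at a time
ensemble = OnlineBagging(MajorityClassLearner, n_models=5)
for i in range(100):
    ensemble.learn_one({"feature": i}, "a" if i % 4 else "b")
```

Because every base learner keeps its own model state, the ensemble's memory footprint grows with `n_models`, which is exactly the cost the paper targets by swapping the VFDT for the more memory-frugal SVFDT.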
