Abstract

Data stream classification is of great significance to numerous real-world scenarios. Nevertheless, prevalent data stream classification techniques are affected by concept drift and become unreliable in non-stationary environments. Ensemble models typically succeed when they increase diversity among their members, and several diversity-enhancing ensembles have been proposed in the literature. Regrettably, there is no established method to verify that cooperativity indeed improves performance. In response to this knowledge gap, we have developed an innovative ensemble learning framework driven by diversity and cooperativity, termed EDDC, to address the issue. EDDC first dynamically maintains multiple groups of classifiers, with the primary classifier in each group chosen to enhance diversity. Next, cooperativity is employed to update groups and replace outdated members. Finally, when the environment changes, EDDC adaptively selects either diversity or cooperativity as the strategy for predicting the labels of new instances, while also providing a performance guarantee. Through simulation experiments, we assessed the performance of EDDC and the benefits of cooperativity for enhancing prediction. The results demonstrated that EDDC is efficient and robust in most scenarios, particularly when dealing with gradual drift. Furthermore, EDDC remains competitive in classification accuracy and other metrics.
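To make the abstract's workflow concrete, the sketch below illustrates the kind of test-then-train stream ensemble it describes, using scikit-learn's incremental GaussianNB. It is a minimal, hypothetical sketch: the single-group layout (the full framework maintains multiple groups), the accuracy-decay weighting, the member-replacement rule, and the drift heuristic that switches between a diversity (primary-only) and a cooperative (weighted-vote) prediction strategy are all illustrative assumptions, not the authors' exact EDDC procedure.

```python
# Hypothetical sketch of a diversity/cooperativity-driven stream ensemble.
# Group layout, weighting, replacement rule, and drift test are assumptions,
# not the EDDC algorithm from the paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB


class ClassifierGroup:
    """One group of incremental classifiers; the best-scoring member is 'primary'."""

    def __init__(self, n_members, classes):
        self.classes = np.asarray(classes)
        self.members = [GaussianNB() for _ in range(n_members)]
        self.acc = np.full(n_members, 1e-3)   # smoothed per-member accuracy
        self.primary = 0

    def train(self, X, y):
        for clf in self.members:
            clf.partial_fit(X, y, classes=self.classes)

    def score(self, X, y, decay=0.9):
        """Prequential accuracy update; replace the weakest member (assumed rule)."""
        for i, clf in enumerate(self.members):
            self.acc[i] = decay * self.acc[i] + (1 - decay) * (clf.predict(X) == y).mean()
        worst = int(np.argmin(self.acc))
        if self.acc[worst] < 0.5 * self.acc.max():
            self.members[worst] = GaussianNB()
            self.members[worst].partial_fit(X, y, classes=self.classes)
            self.acc[worst] = self.acc.mean()
        self.primary = int(np.argmax(self.acc))

    def predict(self, X, cooperative):
        if not cooperative:                    # diversity mode: trust the primary only
            return self.members[self.primary].predict(X)
        votes = np.zeros((len(X), len(self.classes)))
        for w, clf in zip(self.acc, self.members):
            votes += w * clf.predict_proba(X)  # cooperative, accuracy-weighted vote
        return self.classes[np.argmax(votes, axis=1)]


# Usage sketch: a prequential (test-then-train) loop over a synthetic drifting stream.
rng = np.random.default_rng(0)
group, recent, cooperative = None, [], True
for chunk in range(50):
    drifted = chunk >= 25                      # synthetic concept drift halfway through
    X = rng.normal(size=(100, 5))
    y = ((X[:, 1] if drifted else X[:, 0]) > 0).astype(int)
    if group is None:
        group = ClassifierGroup(n_members=4, classes=[0, 1])
        group.train(X, y)
        continue
    y_hat = group.predict(X, cooperative)
    recent.append((y_hat == y).mean())
    if len(recent) > 5:
        recent.pop(0)
        # assumed drift heuristic: on an accuracy drop, fall back to the more
        # reactive diversity strategy; otherwise keep the cooperative vote
        cooperative = recent[-1] >= 0.8 * np.mean(recent[:-1])
    group.score(X, y)
    group.train(X, y)
print(f"mean prequential accuracy over the last chunks: {np.mean(recent):.3f}")
```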
