Abstract

Ensemble methods are among the most effective concept‐drift adaptation techniques due to their high learning performance and flexibility. However, they are computationally expensive and pose a challenge in applications involving high‐speed data streams. In this paper, we present a computationally efficient heterogeneous classifier ensemble called OMS‐MAB, which uses online model selection for concept‐drift adaptation by posing it as a non‐stationary multi‐armed bandit (MAB) problem. We use a MAB to select a single adaptive learner within the ensemble for learning and prediction while systematically exploring promising alternatives. Each ensemble member is made drift resistant using explicit drift detection and is represented as an arm of the MAB. An exploration factor controls the trade‐off between predictive performance and computational resource requirements, eliminating the need to continuously train and evaluate all the ensemble members. A rigorous evaluation on 20 benchmark datasets and 9 algorithms indicates that the accuracy of OMS‐MAB is statistically on par with state‐of‐the‐art (SOTA) ensembles. Moreover, it offers a significant reduction in execution time and model size compared to several SOTA ensemble methods, making it a promising ensemble for resource‐constrained stream‐mining problems.
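The core idea described above, selecting one ensemble member per step via a non‐stationary MAB, can be illustrated with a minimal sketch. This is not the authors' OMS‐MAB implementation: the class name, the epsilon‐greedy selection rule, and the discounted reward estimates are assumptions chosen to mimic the described behavior (an exploration factor plus forgetting to handle drift); the paper may use a different bandit policy.

```python
import random


class MABModelSelector:
    """Hypothetical sketch: epsilon-greedy non-stationary MAB over base learners.

    Each arm is one ensemble member. Only the selected arm is trained and
    evaluated at each step, avoiding the cost of updating the full ensemble.
    """

    def __init__(self, n_arms, epsilon=0.1, discount=0.99):
        self.n_arms = n_arms
        self.epsilon = epsilon      # exploration factor (assumed policy)
        self.discount = discount    # forgetting factor for non-stationarity
        self.reward_sums = [0.0] * n_arms
        self.counts = [0.0] * n_arms

    def select_arm(self):
        # With probability epsilon, explore a random arm; otherwise exploit
        # the arm with the highest discounted mean reward. Unplayed arms
        # score +inf so every arm is tried at least once.
        if random.random() < self.epsilon:
            return random.randrange(self.n_arms)
        return max(
            range(self.n_arms),
            key=lambda i: (self.reward_sums[i] / self.counts[i]
                           if self.counts[i] > 0 else float("inf")),
        )

    def update(self, arm, reward):
        # Discount all estimates so evidence from before a concept drift
        # decays, letting a previously weak arm regain the lead.
        for i in range(self.n_arms):
            self.reward_sums[i] *= self.discount
            self.counts[i] *= self.discount
        self.reward_sums[arm] += reward
        self.counts[arm] += 1.0
```

In a streaming loop, `select_arm()` would pick the learner to train on and predict with for the current instance, and `update()` would feed back a reward such as prequential accuracy; a drift detector per arm (as in the paper) would sit alongside this selector.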