Abstract

Ensemble learning is a powerful approach for achieving more accurate predictions than a single classifier. However, this classification ability comes at the expense of heavy storage requirements and computational burdens imposed by the ensemble. Ensemble pruning is a crucial step for reducing this predictive overhead without worsening the performance of the original ensemble. This paper proposes an efficient and effective ordering-based ensemble pruning method based on decision tree induction. The proposed method maps the dataset and the base classifiers to a new dataset, in which ensemble pruning is transformed into a feature selection problem. A set of accurate, diverse and complementary base classifiers can then be selected through decision tree induction. Moreover, an evaluation function that deliberately favors candidate sub-ensembles with improved performance in classifying low-margin instances has also been designed. Comparative experiments on 24 benchmark datasets demonstrate the effectiveness of the proposed method.
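
The core transformation described above can be illustrated with a minimal sketch, assuming a scikit-learn bagging ensemble and a held-out validation set; the dataset, classifier count, tree depth, and selection rule (keeping the classifiers whose prediction columns are used as split features) are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy data and an unpruned bagging ensemble (illustrative only).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
ensemble = BaggingClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Map the validation set and the base classifiers to a new dataset:
# column j holds classifier j's predictions, so ensemble pruning
# becomes a feature (column) selection problem.
Z_val = np.column_stack([clf.predict(X_val) for clf in ensemble.estimators_])

# Induce a shallow decision tree on the new dataset; the classifiers whose
# columns appear as split features form the pruned sub-ensemble.
selector = DecisionTreeClassifier(max_depth=5, random_state=0).fit(Z_val, y_val)
selected = np.unique(selector.tree_.feature[selector.tree_.feature >= 0])
pruned = [ensemble.estimators_[i] for i in selected]
print(f"kept {len(pruned)} of {len(ensemble.estimators_)} base classifiers")
```

In this sketch the induced tree implicitly rewards accurate, diverse and complementary columns, since redundant classifiers add no information gain once a similar column has already been used for a split.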
