Abstract

Ensemble pruning has been widely used to improve classification performance while employing fewer classifiers. It selects a subset of classifiers with good overall performance to form the final ensemble. Both the diversity and the accuracy of the member classifiers are vital to a successful ensemble, yet it is hard for the members of one ensemble to achieve good diversity and high accuracy simultaneously, because there is a tradeoff between them. Existing works usually search for this tradeoff via diversity measures or heuristic algorithms, which cannot guarantee the exact solution without exhaustive search. To address this issue, a novel ensemble pruning method, abbreviated EPIECM, is proposed that combines information exchange glowworm swarm optimization (IEGSO) with the complementarity measure (COM). Firstly, multiple generated classifiers are used to construct a pool of learners that perform diversely. Secondly, COM pre-prunes the classifiers with poor comprehensive performance, and the pre-pruned ensemble is formed from the retained classifiers. Finally, the optimal subset of classifiers is selected from the remaining constituents with IEGSO. Empirical results on 27 UCI datasets indicate that EPIECM significantly outperforms competing techniques.
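The three-stage pipeline described above (pool construction, COM pre-pruning, swarm-based subset search) can be sketched as follows. This is a minimal illustration, not the paper's implementation: base classifiers are simulated by their 0/1 prediction vectors on a validation set, the complementarity step is a simplified greedy variant, and a plain stochastic subset search stands in for IEGSO; all names and constants here are illustrative assumptions.

```python
import random

random.seed(0)

# --- Stage 1 (assumed setup): a pool of diverse learners, each represented
# only by its predictions on a shared validation set of 200 binary labels.
N_VAL = 200
labels = [random.randint(0, 1) for _ in range(N_VAL)]

def make_classifier(acc):
    """Simulate a base learner that predicts each label correctly with probability `acc`."""
    return [y if random.random() < acc else 1 - y for y in labels]

pool = [make_classifier(random.uniform(0.55, 0.80)) for _ in range(20)]

def vote(members, i):
    """Majority vote of the member classifiers on validation instance i."""
    return 1 if sum(m[i] for m in members) * 2 > len(members) else 0

def ensemble_accuracy(members):
    """Majority-vote accuracy of a list of prediction vectors."""
    return sum(vote(members, i) == y for i, y in enumerate(labels)) / len(labels)

def com_preprune(pool, keep):
    """Stage 2, simplified complementarity measure: start from the single most
    accurate classifier, then greedily add the classifier that is correct on
    the most validation instances the current sub-ensemble misclassifies."""
    chosen = [max(pool, key=lambda m: ensemble_accuracy([m]))]
    rest = [m for m in pool if m is not chosen[0]]
    while len(chosen) < keep:
        wrong = [i for i, y in enumerate(labels) if vote(chosen, i) != y]
        best = max(rest, key=lambda m: sum(m[i] == labels[i] for i in wrong))
        chosen.append(best)
        rest.remove(best)
    return chosen

def subset_search(candidates, iters=300):
    """Stage 3 stand-in: the paper uses IEGSO; here a simple stochastic subset
    search plays that role. Every singleton is scored first, then random
    subsets, keeping the most accurate combination seen."""
    best_set, best_acc = None, -1.0
    for m in candidates:
        acc = ensemble_accuracy([m])
        if acc > best_acc:
            best_set, best_acc = [m], acc
    for _ in range(iters):
        subset = [m for m in candidates if random.random() < 0.5]
        if subset and ensemble_accuracy(subset) > best_acc:
            best_set, best_acc = subset, ensemble_accuracy(subset)
    return best_set, best_acc

pre = com_preprune(pool, keep=10)
final, final_acc = subset_search(pre)
best_single = max(ensemble_accuracy([m]) for m in pool)
print(f"best single: {best_single:.3f}  pruned ensemble: {final_acc:.3f}")
```

Because the search also scores each pre-pruned classifier on its own, the final sub-ensemble is never worse on the validation set than the best single member, illustrating why combining a diversity-driven pre-pruning step with a global subset search is attractive.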
