Abstract

In this research, we propose a variant of the firefly algorithm (FA) for classifier ensemble reduction. It incorporates both accelerated attractiveness and evading strategies to overcome the premature convergence problem of the original FA model. The attractiveness strategy takes not only neighboring but also global best solutions into account in order to guide the firefly swarm toward optimal regions with fast convergence, while the evading action employs both neighboring and global worst solutions to drive the search away from unpromising ("gloomy") regions. The proposed algorithm is subsequently used to perform discriminative base classifier selection, generating optimized ensemble classifiers without compromising classification accuracy. Evaluated with standard, shifted, and composite test functions, as well as the Black-Box Optimization Benchmarking test suite and several high-dimensional UCI data sets, the empirical results indicate that, based on statistical tests, the proposed FA model outperforms other state-of-the-art FA variants and classical metaheuristic search methods in solving diverse complex unimodal and multimodal optimization and ensemble reduction problems. Moreover, the resulting ensemble classifiers show superior performance compared with the original, full-sized ensemble models.
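To make the combined attraction/evasion idea concrete, the sketch below shows one possible position update for a single firefly, written in plain Python/NumPy. It is only an illustration of the mechanism described above, not the paper's actual update equations: the coefficient names (beta0, gamma, alpha, mu) and the way the best/worst terms are weighted are assumptions introduced here for clarity.

```python
import numpy as np

def firefly_update(x_i, x_j_best, g_best, x_j_worst, g_worst, rng,
                   beta0=1.0, gamma=1.0, alpha=0.2, mu=0.5):
    """Illustrative single-firefly update combining attraction and evasion.

    x_i       : current position of firefly i
    x_j_best  : a brighter neighboring firefly
    g_best    : global best position found so far
    x_j_worst : the dimmest neighboring firefly
    g_worst   : global worst position found so far
    All coefficients are hypothetical, not the paper's parameters.
    """
    # Standard FA attractiveness, decaying with squared distance.
    def beta(a, b):
        return beta0 * np.exp(-gamma * np.sum((a - b) ** 2))

    # Accelerated attraction: pull toward both the brighter neighbor
    # and the global best solution.
    attract = (beta(x_i, x_j_best) * (x_j_best - x_i)
               + beta(x_i, g_best) * (g_best - x_i))

    # Evading action: push away from the neighboring and global worst
    # ("gloomy") solutions.
    evade = mu * (beta(x_i, x_j_worst) * (x_i - x_j_worst)
                  + beta(x_i, g_worst) * (x_i - g_worst))

    # Random walk term, as in the original FA.
    noise = alpha * (rng.random(x_i.shape) - 0.5)

    return x_i + attract + evade + noise
```

For the ensemble reduction application, each firefly position would typically be mapped to a binary selection mask over the base classifiers (e.g., via thresholding or a transfer function), with fitness given by the validation accuracy of the reduced ensemble; the exact encoding used in the paper is not specified in the abstract.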
