Abstract

Ensemble learning is a powerful machine learning paradigm that leverages a collection of diverse base learners to achieve better prediction performance than any individual base learner could achieve on its own. This work proposes an ensemble learning framework based on evolutionary generation of feature subspaces, which formulates the search for the most suitable feature subspace for each base learner as an optimization task and solves the tasks corresponding to different base learners simultaneously via an evolutionary multi-task feature selection algorithm, so that solving one task may help solve the others through implicit knowledge transfer. The feature subspaces generated in this way are expected to outperform those obtained by seeking the optimal feature subspace for each base learner independently. We implement the proposed framework by using SVM, KNN, and decision tree as the base learners, proposing a multi-task binary particle swarm optimization algorithm for evolutionary multi-task feature selection, and combining the outputs of the base learners via a majority voting scheme. Experiments on several UCI datasets demonstrate the effectiveness of the proposed method.
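The pipeline described above can be sketched in a simplified form: a binary particle swarm optimizer selects a feature subspace for each of the three base learners, and a majority vote combines their predictions. This is a minimal single-task sketch only; the paper's multi-task optimizer and its implicit knowledge transfer between tasks are not modeled here, and all dataset sizes, swarm parameters, and learner settings below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def bpso_feature_select(model, X, y, n_particles=10, n_iter=10):
    """Binary PSO: each particle is a 0/1 mask over the feature columns.
    Fitness of a mask = mean 3-fold CV accuracy of `model` on the
    selected features (illustrative fitness, not the paper's exact one)."""
    n_feat = X.shape[1]
    pos = rng.integers(0, 2, (n_particles, n_feat))
    vel = rng.uniform(-1.0, 1.0, (n_particles, n_feat))

    def fitness(mask):
        if mask.sum() == 0:           # empty subspace: worst possible score
            return 0.0
        return cross_val_score(model, X[:, mask.astype(bool)], y, cv=3).mean()

    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    gbest_fit = pbest_fit.max()
    w, c1, c2 = 0.7, 1.5, 1.5         # common PSO coefficients (assumed)
    for _ in range(n_iter):
        r1 = rng.random((n_particles, n_feat))
        r2 = rng.random((n_particles, n_feat))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        prob = 1.0 / (1.0 + np.exp(-vel))   # sigmoid transfer function
        pos = (rng.random((n_particles, n_feat)) < prob).astype(int)
        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved] = pos[improved]
        pbest_fit[improved] = fit[improved]
        if fit.max() > gbest_fit:
            gbest_fit = fit.max()
            gbest = pos[fit.argmax()].copy()
    return gbest.astype(bool)

# Synthetic stand-in for a UCI dataset.
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=6, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

learners = [SVC(), KNeighborsClassifier(), DecisionTreeClassifier(random_state=0)]
preds = []
for clf in learners:
    mask = bpso_feature_select(clf, Xtr, ytr)   # per-learner feature subspace
    clf.fit(Xtr[:, mask], ytr)
    preds.append(clf.predict(Xte[:, mask]))

# Majority vote across the three base learners (binary labels).
votes = np.stack(preds)
ensemble_pred = (votes.sum(axis=0) >= 2).astype(int)
acc = (ensemble_pred == yte).mean()
print(f"ensemble accuracy: {acc:.3f}")
```

Because each learner evolves its own mask independently here, adding the paper's multi-task transfer would amount to letting the three swarms exchange promising masks during the search.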
