Abstract

Ensemble learning methods have long attracted attention for their strong performance on supervised classification problems. Nevertheless, deficiencies such as inadequate diversity among classifiers and the presence of redundant classifiers remain among the main challenges of this kind of learning. In recent years, the density peak method has been used in clustering to improve the clustering process by selecting cluster centers from local density peaks. In this paper, inspired by this idea, a new method based on the density peak criterion is proposed to create parallel ensembles. This criterion creates diverse training sets, which in turn yield diverse classifiers. In the proposed method, a multi-objective evolutionary decomposition-based optimization process generates a set of (near-)optimal diverse training datasets to improve the performance of non-sequential ensemble learning methods. To this end, density peak is used as the first objective function and accuracy as the second. To demonstrate the superiority of the proposed method, it is compared with state-of-the-art methods over 19 datasets. For a fairer comparison, non-parametric statistical tests are applied, and the results show that the proposed method significantly outperforms the other employed methods.
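The density-peak criterion referenced above can be illustrated with a minimal sketch in the style of Rodriguez and Laio's density-peak clustering: each point receives a local density `rho` (neighbors within a cutoff `d_c`) and a separation `delta` (distance to the nearest denser point), and points scoring high on both are density peaks. The cutoff value and the tie-breaking by sort order are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def density_peak_scores(X, d_c=0.5):
    """Density-peak scores for a point set X (n_samples x n_features).

    rho[i]   : local density = number of other points within cutoff d_c
    delta[i] : distance to the nearest point of higher density; for the
               densest point, the distance to the farthest point.
    Points with both high rho and high delta are density peaks.
    """
    n = len(X)
    # pairwise Euclidean distances
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    rho = (dist < d_c).sum(axis=1) - 1  # subtract 1 to exclude self

    # process points from densest to sparsest; stable sort breaks ties by index
    order = np.argsort(-rho, kind="stable")
    delta = np.empty(n)
    delta[order[0]] = dist[order[0]].max()
    for k in range(1, n):
        i = order[k]
        delta[i] = dist[i, order[:k]].min()
    return rho, delta

# toy example: two tight, well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
rho, delta = density_peak_scores(X, d_c=0.5)
```

On this toy set, one point per cluster ends up with a large `delta`, so the two density peaks identify the two clusters; a diversity-oriented objective can reward training subsets whose members spread across such peaks.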
