Abstract
Ensemble methods have been successfully used as classification schemes. The need to reduce the complexity of this popular learning paradigm motivated the development of ensemble pruning algorithms. This paper presents a new ensemble pruning method that greatly reduces the complexity of ensemble methods and outperforms complete bagging in terms of classification accuracy. More importantly, it is a very fast algorithm. It consists of ordering all base classifiers with respect to a new criterion that exploits an unsupervised ensemble margin. The method highlights the major influence of low-margin instances on the performance of the pruning task and, more generally, the potential of low-margin instances for the design of better ensembles. An extensive empirical analysis compares the method to both the naive approach of randomly pruning base classifiers and another ordering-based pruning algorithm.
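The abstract does not give the exact ordering criterion, so the following is only a minimal sketch of the general idea: it assumes the common definition of the unsupervised ensemble margin (the gap between the two largest class vote counts, normalised by ensemble size, requiring no true labels) and a hypothetical scoring rule that ranks base classifiers by their agreement with the ensemble vote, weighted towards low-margin instances. All function names and the weighting scheme are illustrative assumptions, not the authors' method.

```python
import numpy as np

def order_classifiers_by_low_margin(ensemble, X, n_classes):
    """Hypothetical ordering-based pruning step.

    Ranks base classifiers by their agreement with the ensemble vote,
    giving more weight to low-margin (ambiguous) instances.
    Class labels are assumed to be integers 0..n_classes-1.
    """
    # Predictions of every base classifier: shape (T, n).
    preds = np.array([clf.predict(X) for clf in ensemble])
    T, n = preds.shape

    # Per-instance class vote counts: shape (n_classes, n).
    vote_counts = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes), 0, preds)

    # Unsupervised ensemble margin: gap between the two largest vote
    # counts, normalised by the ensemble size (no true labels needed).
    sorted_votes = np.sort(vote_counts, axis=0)
    margins = (sorted_votes[-1] - sorted_votes[-2]) / T

    # Emphasise low-margin instances (illustrative weighting choice).
    weights = 1.0 - margins
    ensemble_vote = vote_counts.argmax(axis=0)

    # Score each classifier by its weighted agreement with the ensemble vote,
    # then return classifiers from best to worst; keeping the first k prunes
    # the ensemble.
    scores = ((preds == ensemble_vote) * weights).sum(axis=1)
    return np.argsort(scores)[::-1]
```

In this sketch, pruning amounts to keeping only the first k classifiers of the returned ordering, so the cost of the ranking step is a single pass over the ensemble's predictions, which is consistent with the speed claim in the abstract.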