Abstract
Ensemble methods often produce more accurate classifiers than their individual members. In multiclass problems, an ensemble can be obtained by combining binary classifiers. Doing so is sensible even when a multiclass method is available, because the ensemble of binary classifiers can be more accurate than a single multiclass classifier. An ensemble of nested dichotomies (END) is a method for handling multiclass classification problems with binary classifiers. A nested dichotomy organizes the classes in a tree; each internal node contains a binary classifier. A given set of classes can be organized into many different nested dichotomies, and an END combines several of them. This paper studies the use of this method in conjunction with ensembles of decision trees (forests). Although forest methods can deal directly with several classes, their accuracy can be improved by using them as base classifiers in ensembles of nested dichotomies. Accuracy can be improved even further using forests of nested dichotomies, that is, ensemble methods whose base classifiers are nested dichotomies of decision trees. The improvements over forest methods can be explained by the increased diversity of the base classifiers. The best overall results were obtained using MultiBoost with resampling.
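To illustrate the idea, the following is a minimal sketch of a nested dichotomy and a small END. It is not the paper's implementation: a toy nearest-centroid binary classifier stands in for the decision-tree forests studied in the paper, the class split at each node is chosen uniformly at random, and all class and function names here are hypothetical.

```python
import random

class NearestCentroid:
    """Toy binary base classifier: predicts whichever class centroid is nearer.
    (A stand-in for the decision-tree-based classifiers used in the paper.)"""
    def fit(self, X, y):
        self.cents = {}
        for label in (0, 1):
            pts = [x for x, t in zip(X, y) if t == label]
            self.cents[label] = [sum(col) / len(pts) for col in zip(*pts)]
        return self

    def predict_one(self, x):
        d2 = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
        return 0 if d2(self.cents[0]) <= d2(self.cents[1]) else 1


class NestedDichotomy:
    """One nested dichotomy: a binary tree over the class set, with a binary
    classifier at each internal node deciding which subtree a sample enters."""
    def __init__(self, rng=None):
        self.rng = rng or random.Random(0)

    def fit(self, X, y):
        self.root = self._build(X, y, sorted(set(y)))
        return self

    def _build(self, X, y, classes):
        if len(classes) == 1:
            return ("leaf", classes[0])          # single class: nothing to split
        # Randomly partition the class set into two non-empty subsets.
        shuffled = classes[:]
        self.rng.shuffle(shuffled)
        cut = self.rng.randint(1, len(shuffled) - 1)
        left, right = set(shuffled[:cut]), set(shuffled[cut:])
        # Relabel the training data as a binary problem: 0 = left, 1 = right.
        yb = [0 if t in left else 1 for t in y]
        clf = NearestCentroid().fit(X, yb)
        Xl = [x for x, t in zip(X, y) if t in left]
        yl = [t for t in y if t in left]
        Xr = [x for x, t in zip(X, y) if t in right]
        yr = [t for t in y if t in right]
        return ("node", clf,
                self._build(Xl, yl, sorted(left)),
                self._build(Xr, yr, sorted(right)))

    def predict_one(self, x):
        node = self.root
        while node[0] == "node":
            _, clf, l, r = node
            node = l if clf.predict_one(x) == 0 else r
        return node[1]


class END:
    """Ensemble of nested dichotomies: majority vote over several trees
    built with different random class splits."""
    def __init__(self, n=5):
        self.members = [NestedDichotomy(random.Random(s)) for s in range(n)]

    def fit(self, X, y):
        for m in self.members:
            m.fit(X, y)
        return self

    def predict_one(self, x):
        votes = [m.predict_one(x) for m in self.members]
        return max(set(votes), key=votes.count)
```

Because different members of the END use different random class splits, their node-level binary problems differ, which is the source of the base-classifier diversity the paper credits for the accuracy improvements.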