Abstract

We focus on methods for solving multiclass learning problems using only simple and efficient binary learners. We investigate the approach of Dietterich and Bakiri [2] based on error-correcting codes (which we call ECC). We distill error correlation as one of the key parameters influencing the performance of the ECC approach, and prove upper and lower bounds on the training error of the final hypothesis in terms of the error correlation between the various binary hypotheses. Boosting is a powerful and well-studied learning technique that appears to overcome the disadvantages of error correlation by cleverly reweighting training examples and hypotheses. An interesting algorithm called ADABOOST.OC [12] combines boosting with the ECC approach, yielding an algorithm that enjoys the performance advantages of boosting while relying only on simple binary weak learners. We propose a variant of this algorithm, which we call ADABOOST.ECC. By weighting the votes of the weak hypotheses differently, ADABOOST.ECC improves on the performance of ADABOOST.OC both theoretically and experimentally, and it is arguably a more direct reduction of multiclass learning to binary learning than previous multiclass boosting algorithms.
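To make the reduction concrete, the following is a minimal sketch of the ECC approach the abstract describes: each class is assigned a binary codeword, one binary learner is trained per code bit, and a test point is assigned to the class whose codeword is nearest in Hamming distance to the predicted bits. The decision-tree weak learner and helper names (`make_code_matrix`, `ecc_train`, `ecc_predict`) are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the ECC reduction of multiclass to binary learning,
# assuming scikit-learn-style binary learners. All names here are
# illustrative, not from Dietterich and Bakiri [2].
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def make_code_matrix(k, n_bits, rng):
    """Random k x n_bits binary code matrix; row i is class i's codeword."""
    while True:
        M = rng.integers(0, 2, size=(k, n_bits))
        # Re-draw if any two codewords coincide or any column is constant,
        # so codewords are distinct and every binary subproblem is non-trivial.
        if len({tuple(r) for r in M}) == k and not np.any(M.min(0) == M.max(0)):
            return M


def ecc_train(X, y, M):
    """Train one binary learner per column of the code matrix.

    Column b relabels each example by the b-th bit of its class's codeword.
    """
    return [DecisionTreeClassifier(max_depth=3).fit(X, M[y, b])
            for b in range(M.shape[1])]


def ecc_predict(X, learners, M):
    """Predict the class whose codeword is Hamming-nearest to the bit votes."""
    bits = np.column_stack([h.predict(X) for h in learners])
    # Hamming distance from each example's predicted bits to each codeword.
    dists = (bits[:, None, :] != M[None, :, :]).sum(axis=2)
    return dists.argmin(axis=1)
```

The point of the codeword view is that prediction errors made by individual binary learners can be corrected as long as the predicted bit vector stays closer to the true codeword than to any other; this is exactly where correlated errors hurt, since errors concentrated on the same examples defeat the error-correcting margin of the code.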
