Abstract

Boosting, one of the best off-the-shelf classification methods, has attracted widespread interest in machine learning and statistics. However, the original algorithm was developed for binary classification problems. In this paper, we study multi-class boosting algorithms under the [Formula: see text]-loss framework and devise two multi-class [Formula: see text]-Boost algorithms, based on coordinate descent and gradient descent respectively, to minimize the multi-class [Formula: see text]-loss function. We derive a scoring coding scheme that uses optimal scoring constraints to encode class labels, together with a simple decoder to recover the true class labels. Our boosting algorithms are easy to implement, and their results converge to the global optimum. Experiments on synthetic and real-world datasets show that, compared with several state-of-the-art methods, our algorithms provide more accurate results.
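To make the gradient-descent variant concrete, the following is a minimal sketch of stage-wise gradient boosting on a squared-error multi-class loss. It is not the paper's algorithm: the paper's loss function and its optimal-scoring coding scheme are not reproduced here, so this sketch substitutes plain one-hot coding for the labels, an ordinary L2 loss, least-squares regression stumps as base learners, and an argmax decoder. All function names (`fit_stump`, `boost_fit`, `boost_predict`) are illustrative.

```python
import numpy as np

def fit_stump(X, r):
    """Least-squares regression stump: returns (feature, threshold, left_value, right_value)."""
    n, d = X.shape
    best, best_err = (0, X[0, 0], r.mean(), r.mean()), np.inf
    for j in range(d):
        for t in np.unique(X[:, j])[:-1]:       # every split point except the maximum
            mask = X[:, j] <= t
            cl, cr = r[mask].mean(), r[~mask].mean()
            err = np.sum((r[mask] - cl) ** 2) + np.sum((r[~mask] - cr) ** 2)
            if err < best_err:
                best_err, best = err, (j, t, cl, cr)
    return best

def stump_predict(stump, X):
    j, t, cl, cr = stump
    return np.where(X[:, j] <= t, cl, cr)

def boost_fit(X, y, n_classes, rounds=100, lr=0.1):
    """Gradient-descent boosting on an L2 loss: one additive score function per class,
    each round fitting a stump to the negative gradient (the current residual)."""
    Y = np.eye(n_classes)[y]                    # one-hot coding (stand-in for the scoring coding)
    F = np.zeros((X.shape[0], n_classes))       # current ensemble scores
    model = []
    for _ in range(rounds):
        stage = []
        for k in range(n_classes):
            residual = Y[:, k] - F[:, k]        # negative gradient of the squared loss
            s = fit_stump(X, residual)
            F[:, k] += lr * stump_predict(s, X)
            stage.append(s)
        model.append(stage)
    return model

def boost_predict(model, X, n_classes, lr=0.1):
    F = np.zeros((X.shape[0], n_classes))
    for stage in model:
        for k, s in enumerate(stage):
            F[:, k] += lr * stump_predict(s, X)
    return F.argmax(axis=1)                     # simple decoder: highest score wins
```

Because the L2 loss is convex in the ensemble scores and each stage takes a descent step, this functional gradient descent mirrors the convergence behavior the abstract claims for the [Formula: see text]-loss setting.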
