Abstract

The multicategory support vector machine (MSVM) has been widely used for multicategory classification. Despite its widespread popularity, the regular MSVM cannot provide direct probabilistic results and suffers from excessive computational cost, as it is built on the hinge loss function and requires solving a sum-to-zero constrained quadratic programming problem. In this study, we propose a general refinement of the regular MSVM, termed the simplex-based proximal MSVM (SPMSVM). Our SPMSVM replaces the hinge loss with a novel family of squared error loss functions and removes the explicit sum-to-zero constraint through the simplex structure. Consequently, the SPMSVM only requires solving an unconstrained linear system, leading to closed-form solutions. In addition, the SPMSVM can be cast as a weighted regression problem, so it is scalable to large-scale applications. Moreover, the SPMSVM naturally yields an estimate of the conditional category probability, which makes it more informative than the regular MSVM. Theoretically, the SPMSVM is shown to include many existing MSVMs as special cases, and its asymptotic and finite-sample statistical properties are established. Simulations and real examples show that the proposed SPMSVM is a stable, scalable and competitive classifier.
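
The abstract does not give the exact loss family, weighting scheme, or probability formula, so the following is only a minimal sketch of the general idea it describes: encode the K class labels as vertices of a simplex in R^(K-1), fit the encoded responses with a ridge-type squared-error objective (an unconstrained linear system with a closed-form solution), and classify by the nearest vertex. The function names (simplex_vertices, fit_simplex_ls, predict), the plain ridge penalty, and the toy data are illustrative assumptions, not the paper's method or API.

```python
import numpy as np

def simplex_vertices(k):
    # K unit vectors in R^(K-1) with pairwise inner product -1/(K-1),
    # summing to zero (standard simplex label coding; assumed, not from the paper).
    if k == 2:
        return np.array([[1.0], [-1.0]])
    W = np.full((k, k - 1), -(1.0 + np.sqrt(k)) / (k - 1) ** 1.5)
    W[0, :] = 1.0 / np.sqrt(k - 1)
    W[np.arange(1, k), np.arange(k - 1)] += np.sqrt(k / (k - 1))
    return W

def fit_simplex_ls(X, y, k, lam=1.0):
    # Squared-error (proximal-style) fit of the simplex-coded labels:
    # a multivariate ridge regression, solved in closed form as a linear system.
    n, p = X.shape
    Xa = np.hstack([np.ones((n, 1)), X])   # intercept column
    W = simplex_vertices(k)
    Y = W[y]                               # each label mapped to its simplex vertex
    B = np.linalg.solve(Xa.T @ Xa + lam * np.eye(p + 1), Xa.T @ Y)
    return B, W

def predict(B, W, X):
    # Assign each point to the class whose vertex has the largest inner
    # product with the fitted (K-1)-dimensional score vector.
    scores = np.hstack([np.ones((X.shape[0], 1)), X]) @ B @ W.T
    return scores.argmax(axis=1)

# toy usage on synthetic three-class data
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 0] - X[:, 1] > 1).astype(int)
B, W = fit_simplex_ls(X, y, k=3, lam=0.1)
print("training accuracy:", (predict(B, W, X) == y).mean())
```

Because the fit reduces to a single regularized least-squares solve, adding observation weights (as in the weighted-regression view mentioned in the abstract) or swapping in a kernelized design matrix would change only the linear system, not the overall structure of the sketch.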
