Abstract

This paper presents two novel second-order cone programming (SOCP) formulations that determine a linear predictor using Support Vector Machines (SVMs). Inspired by the soft-margin SVM formulation, our first approach (ξ-SOCP-SVM) relaxes the conic constraints via a slack variable, which is penalized in the objective function. The second formulation (r-SOCP-SVM) is based on the LP-SVM principle: the bound on the VC dimension is suitably loosened using the l∞-norm, and the margin is maximized directly. The proposed methods have several advantages. The first approach constructs a flexible classifier, extending the benefits of the soft-margin SVM formulation to second-order cones. The second method obtains results comparable to the SOCP-SVM formulation with less computational effort, since one conic constraint is eliminated. Experiments on well-known benchmark datasets from the UCI Repository demonstrate that our approach achieves better classification performance than the traditional SOCP-SVM formulation, LP-SVM, and the standard linear SVM.
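For context, the slack-relaxed ξ-SOCP-SVM described above can be sketched as follows, starting from the standard hard-margin SOCP-SVM of the literature. The symbols below (class means μ_i, covariance factors S_i, probability parameters κ_i, and penalty C) are assumptions based on the usual SOCP-SVM notation, not necessarily the paper's exact formulation:

```latex
\begin{aligned}
\min_{w,\,b,\,\xi}\quad & \tfrac{1}{2}\|w\|_2^2 + C\,\xi \\
\text{s.t.}\quad & w^\top \mu_1 - b \;\ge\; 1 - \xi + \kappa_1\,\|S_1^\top w\|_2, \\
& -\bigl(w^\top \mu_2 - b\bigr) \;\ge\; 1 - \xi + \kappa_2\,\|S_2^\top w\|_2, \\
& \xi \;\ge\; 0,
\end{aligned}
```

where $\mu_i$ and $S_i S_i^\top$ are the empirical mean and covariance of class $i$, and $\kappa_i = \sqrt{\eta_i/(1-\eta_i)}$ encodes the required probability $\eta_i$ of correctly classifying points of class $i$. Setting $\xi = 0$ recovers the hard conic constraints, so the slack variable plays the same role as in the soft-margin SVM.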
