Abstract
Support vector machines can be posed as quadratic programming problems in a variety of ways. This paper investigates the 2-norm soft margin SVM with an additional quadratic penalty on the bias term, which leads to a positive definite quadratic program in feature space whose only constraint is nonnegativity. For the linear classification problem, an unconstrained problem is proposed as the Lagrangian dual of this quadratic program. The resulting problem minimizes a differentiable convex piecewise quadratic function of lower dimension in input space; a semismooth Newton algorithm is introduced to solve it quickly, yielding the Semismooth Newton Support Vector Machine (SNSVM). After the kernel matrix is factorized by Cholesky factorization or incomplete Cholesky factorization, the nonlinear kernel classification problem can also be solved by SNSVM, with no apparent increase in the complexity of the algorithm. Numerical experiments demonstrate that our algorithm is comparable with similar algorithms such as the Lagrangian Support Vector Machine (LSVM) and the Semismooth Support Vector Machine (SSVM).
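As a sketch of the formulation the abstract describes (the notation below is an assumption in the style of the 2-norm soft margin literature, not taken from the paper): let A be the m-by-n matrix of training points, D = diag(y) the diagonal matrix of the ±1 labels, e the vector of ones, and C > 0 the penalty parameter. A plausible primal, with the quadratic bias penalty, is

\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + \tfrac{1}{2}b^2 + \tfrac{C}{2}\|\xi\|^2
\quad \text{s.t.}\quad D(Aw + b e) \ge e - \xi ,
\]

where the constraint \(\xi \ge 0\) is redundant because of the 2-norm penalty on the slacks. Eliminating \(w = A^\top D\alpha\), \(b = e^\top D\alpha\), and \(\xi = \alpha / C\) gives the positive definite dual QP in feature space with only the nonnegativity constraint,

\[
\min_{\alpha \ge 0}\ \tfrac{1}{2}\,\alpha^{\top}\!\left( D(AA^{\top} + ee^{\top})D + \tfrac{1}{C}I \right)\!\alpha \;-\; e^{\top}\alpha ,
\]

and dualizing once more yields an unconstrained, differentiable, convex piecewise quadratic problem in the n + 1 input-space variables \((w, b)\):

\[
\min_{w,\,b}\ \tfrac{1}{2}\|w\|^2 + \tfrac{1}{2}b^2 + \tfrac{C}{2}\,\big\|\big(e - D(Aw + b e)\big)_+\big\|^2 ,
\]

where \((\cdot)_+ = \max(\cdot, 0)\) componentwise. The gradient of this objective is piecewise linear, hence semismooth, which is what makes a semismooth Newton iteration applicable.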