Abstract

We introduce the Direct L2 Support Vector Machine (DL2 SVM) classifier and present the performance of its different implementation algorithms on 12 real binary and multi-class datasets. The DL2 SVM algorithm is based on solving a Nonnegative Least Squares (NNLS) problem, which yields the desired solution in much less CPU time than SVM methods based on solving a quadratic programming (QP) problem. Two techniques for solving the NNLS problem arising in DL2 SVM are Cholesky decomposition with an update and the Conjugate Gradient method. Both produce high and similar classification accuracy within a strict nested cross-validation (a.k.a. double resampling) experimental setting. The similarities and differences of the two NNLS solvers are pointed out, and their performance is compared in terms of accuracy, percentage of support vectors, and CPU time.
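
As a rough illustration of the kind of subproblem the abstract refers to, the sketch below solves a generic NNLS problem, min ||Ax - b|| subject to x >= 0, using SciPy's nnls routine. The matrix A and vector b here are synthetic placeholders, not the actual DL2 SVM system matrix built from the kernel, labels, and penalty parameter described in the paper, and the routine shown is neither of the two solvers (Cholesky with update, Conjugate Gradient) studied by the authors.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic stand-ins for the linear system that DL2 SVM reduces to.
# In the paper, this system is built from the kernel matrix, the labels,
# and the penalty parameter; none of that is reproduced here.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))   # hypothetical system matrix
b = rng.standard_normal(50)         # hypothetical right-hand side

# Solve min ||A x - b||_2 subject to x >= 0 (the NNLS problem).
x, residual_norm = nnls(A, b)

# Nonzero coefficients play a role analogous to support vectors.
print("nonzero coefficients:", np.count_nonzero(x))
print("residual norm:", residual_norm)
```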
