Abstract
We introduce the Direct L2 Support Vector Machine (DL2 SVM) classifier and present the performance of its different implementation algorithms on 12 real binary and multi-class datasets. The DL2 SVM algorithm is based on solving a Nonnegative Least Squares (NNLS) problem, which finds the desired solution in much less CPU time than other SVM methods based on solving a quadratic programming (QP) problem. Two techniques for solving the NNLS problem arising in the DL2 SVM algorithm are Cholesky decomposition with an update and the Conjugate Gradient method. Both produce high and similar classification accuracies within a strict nested cross-validation (a.k.a. double resampling) experimental environment. The similarities and differences of the two NNLS solvers are pointed out, and their performances are compared in terms of accuracy, percentage of support vectors, and CPU time.
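To illustrate the kind of subproblem the abstract refers to, the following minimal Python sketch solves a generic NNLS problem, min_x ||Ax - b||_2 subject to x >= 0, using SciPy's standard NNLS routine as a stand-in. The matrix and right-hand side are synthetic placeholders, not the DL2 SVM formulation, and neither the paper's Cholesky-with-update nor its Conjugate Gradient solver is reproduced here.

```python
# Minimal illustrative sketch of a nonnegative least squares (NNLS) solve:
#   min_x ||A x - b||_2   subject to   x >= 0
# This uses scipy.optimize.nnls as a generic solver; the DL2 SVM paper's own
# Cholesky-with-update and Conjugate Gradient techniques are not shown.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Synthetic placeholder system (NOT the actual DL2 SVM training matrix).
A = rng.standard_normal((100, 20))
x_true = np.maximum(rng.standard_normal(20), 0.0)   # nonnegative ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)    # noisy right-hand side

x_hat, residual_norm = nnls(A, b)

print("nonzero coefficients:", np.count_nonzero(x_hat))
print("residual norm:", residual_norm)
```

In an SVM setting, the nonzero components of such a nonnegative solution loosely correspond to support vectors, which is why the sparsity of the NNLS solution relates to the percentage-of-support-vectors comparison mentioned above.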