Abstract

We propose a novel, fast algorithm for training support vector machines (SVMs) in the primal space. It solves an approximation of the SVM optimization problem that is unconstrained, continuous, and twice differentiable, using the Newton optimization technique. Further, we devise a special pre-extracting procedure that speeds up the algorithm's convergence by supplying a high-quality initial solution. Theoretical analysis shows that the proposed algorithm produces an ε-approximate solution to the standard SVM while maintaining low computational complexity. Experimental results on benchmark data sets demonstrate that our algorithm is much faster than dual-based methods such as SVMlight while achieving similar test accuracy.
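To make the approach concrete, below is a minimal sketch of primal SVM training with Newton's method, assuming a squared hinge loss as the smooth surrogate; this surrogate matches the stated properties (unconstrained, continuous, twice differentiable) but may differ from the paper's exact approximation, and the function and parameter names are illustrative only.

```python
import numpy as np

def train_primal_svm_newton(X, y, lam=1.0, tol=1e-6, max_iter=50):
    """Primal SVM with squared hinge loss, solved by Newton's method.

    Minimizes  f(w) = (lam/2)*||w||^2 + sum_i max(0, 1 - y_i * x_i.w)^2,
    an unconstrained, continuous, twice-differentiable objective.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        margins = 1.0 - y * (X @ w)
        sv = margins > 0                       # examples in the active set
        # Gradient and Hessian restricted to the active (support) examples.
        grad = lam * w - 2.0 * X[sv].T @ (y[sv] * margins[sv])
        hess = lam * np.eye(d) + 2.0 * X[sv].T @ X[sv]
        step = np.linalg.solve(hess, grad)     # Newton direction
        w -= step
        if np.linalg.norm(step) < tol:
            break
    return w

# Usage on toy linearly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])
w = train_primal_svm_newton(X, y, lam=0.1)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

Because the Hessian changes only through the active set, each Newton step costs one d×d linear solve, which is why a good initial solution (such as one from a pre-extracting step) can substantially reduce the number of iterations.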
