Abstract

In this paper, we propose a novel nonparallel classifier, named the sparse nonparallel support vector machine (SNSVM), for binary classification. Unlike existing nonparallel classifiers such as the twin support vector machines (TWSVMs), SNSVM has several advantages: it constructs two convex quadratic programming problems for both the linear and nonlinear cases, which can be solved efficiently by the successive overrelaxation (SOR) technique; it does not require computing inverse matrices before training; it has sparseness similar to that of standard SVMs; and it degenerates to the TWSVMs when the parameters are chosen appropriately. These properties make SNSVM theoretically preferable to the existing nonparallel classifiers. Experimental results on a large number of data sets demonstrate the effectiveness of our method in both sparseness and classification accuracy, further confirming these conclusions.
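As a rough illustration of the SOR technique mentioned above (not the paper's actual formulation), the following sketch applies an SOR-style projected coordinate update to a generic box-constrained convex quadratic program of the form min_alpha 0.5*alpha'Q*alpha - c'alpha subject to 0 <= alpha <= C, which is the typical shape of the dual problems solved in such classifiers. The matrix Q, vector c, bound C, and relaxation factor omega are placeholders chosen for illustration.

import numpy as np

def sor_box_qp(Q, c, C, omega=1.0, tol=1e-6, max_iter=1000):
    """SOR iteration for: min 0.5*alpha^T Q alpha - c^T alpha, s.t. 0 <= alpha <= C.
    Q is assumed symmetric positive semidefinite with a strictly positive diagonal;
    omega in (0, 2) is the relaxation factor."""
    n = Q.shape[0]
    alpha = np.zeros(n)
    for _ in range(max_iter):
        alpha_old = alpha.copy()
        for i in range(n):
            # Gauss-Seidel-style update of one coordinate, scaled by omega,
            # followed by projection back onto the box [0, C]
            grad_i = Q[i] @ alpha - c[i]
            alpha[i] = min(max(alpha[i] - omega * grad_i / Q[i, i], 0.0), C)
        if np.linalg.norm(alpha - alpha_old) < tol:
            break
    return alpha

Because each pass updates one variable at a time using the latest values of the others, the method needs no matrix factorization or inversion, which is consistent with the training-efficiency claim in the abstract.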
