Abstract

Recently, a modified version of support vector machines (SVMs), the least-squares SVM (LS-SVM) classifier, has been introduced; it is closely related to ridge regression-type SVMs. In LS-SVMs, the classifier is obtained by solving a linear system instead of a quadratic programming problem. In this paper, UCI (University of California at Irvine) benchmark data sets are used to evaluate the performance of LS-SVM classifiers with linear, polynomial and radial basis function (RBF) kernels. The hyperparameters of the LS-SVM formulation are tuned using 10-fold cross-validation combined with a grid search. Comparing the performance of a nonlinear (RBF or polynomial) LS-SVM classifier with that of a linear LS-SVM gives additional insight into the degree of nonlinearity of the classification problem at hand. Based on a statistical comparison of the results, it is concluded that RBF LS-SVM classifiers consistently rank among the best performers on each data set.
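
To make the formulation concrete, the sketch below illustrates the two mechanisms the abstract names: LS-SVM training as the solution of a single square linear system in the bias b and support values alpha, and hyperparameter tuning by 10-fold cross-validation with a grid search. This is a minimal illustrative sketch, not the authors' code; the specific kernel width `sigma`, regularization constant `gamma`, toy data, and candidate grids are assumptions chosen for demonstration only.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # Gaussian RBF kernel: K_ij = exp(-||x_i - x_j||^2 / sigma^2)
    sq = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T)
    return np.exp(-sq / sigma**2)

def lssvm_train(X, y, gamma, sigma):
    # LS-SVM classification: the optimality conditions give one linear system
    #   [ 0       y^T           ] [ b     ]   [ 0 ]
    #   [ y   Omega + I / gamma ] [ alpha ] = [ 1 ]
    # with Omega_ij = y_i * y_j * K(x_i, x_j) and labels y_i in {-1, +1}.
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = np.outer(y, y) * rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(Xtr, ytr, b, alpha, Xte, sigma):
    # Decision function: sign( sum_i alpha_i * y_i * K(x, x_i) + b )
    return np.sign(rbf_kernel(Xte, Xtr, sigma) @ (alpha * ytr) + b)

def cv_error(X, y, gamma, sigma, folds=10, seed=0):
    # 10-fold cross-validation misclassification rate for one (gamma, sigma) pair
    idx = np.random.default_rng(seed).permutation(len(y))
    parts = np.array_split(idx, folds)
    errs = []
    for k in range(folds):
        te = parts[k]
        tr = np.concatenate([parts[j] for j in range(folds) if j != k])
        b, alpha = lssvm_train(X[tr], y[tr], gamma, sigma)
        errs.append(np.mean(lssvm_predict(X[tr], y[tr], b, alpha, X[te], sigma) != y[te]))
    return float(np.mean(errs))

# Toy two-Gaussian data (a stand-in for a UCI benchmark set) and a small grid search
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)), rng.normal(1.0, 1.0, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])
grid = [(g, s) for g in (0.1, 1.0, 10.0, 100.0) for s in (0.5, 1.0, 2.0)]
gamma, sigma = min(grid, key=lambda p: cv_error(X, y, *p))
```

A single coarse grid pass is shown here only to illustrate the mechanism; the tuning procedure described in the paper would typically refine the grid around the best candidates and average results over multiple randomizations of the folds.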
