Abstract

The least squares support vector machine (LSSVM) has performance comparable to the support vector machine (SVM) and has been widely used for classification and regression problems. The solution of the LSSVM is obtained by solving a system of linear equations, but it lacks sparseness, which makes it unable to handle large-scale data sets. The state-of-the-art least angle regression (LARS) method obtains a sparse solution by solving the Least Absolute Shrinkage and Selection Operator (LASSO) problem. We therefore use the idea of LARS to obtain a sparse solution of the LSSVM, and propose an efficient method, RLARS-LSSVM. The method iteratively selects the most important samples as support vectors and simultaneously removes the samples that are similar to the already selected support vectors. Experimental results show that, for the same number of support vectors, the proposed method achieves much higher test accuracy than other sparse LSSVM methods.
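The following is a minimal sketch of the underlying idea only: applying a LARS-based LASSO solver to the LSSVM kernel matrix so that samples with nonzero dual coefficients are kept as support vectors. It uses scikit-learn's LassoLars as a stand-in solver, toy random data, and an assumed RBF kernel; it does not implement the paper's RLARS-LSSVM procedure and omits the removal of samples similar to the selected support vectors.

```python
import numpy as np
from sklearn.linear_model import LassoLars
from sklearn.metrics.pairwise import rbf_kernel

# Hypothetical toy data standing in for any training set.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sign(X[:, 0] + 0.5 * rng.standard_normal(200))

# Kernel matrix of the LSSVM dual problem (RBF kernel assumed here).
K = rbf_kernel(X, X, gamma=0.5)

# LARS solves the LASSO problem min ||y - K @ alpha||^2 + lambda * ||alpha||_1;
# samples with nonzero alpha are retained as support vectors,
# yielding a sparse approximation of the dense LSSVM solution.
lars = LassoLars(alpha=0.01)
lars.fit(K, y)

support_idx = np.flatnonzero(lars.coef_)
print(f"{len(support_idx)} support vectors selected out of {len(y)} samples")
```

Increasing the regularization parameter `alpha` yields fewer support vectors, trading accuracy for sparsity.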
