Abstract

Least squares support vector machines (LS-SVMs) express training as the solution of a system of linear equations, or equivalently a quadratic program (QP) with a single linear equality constraint, in contrast to the QP with box constraints and one linear equality constraint that arises for conventional support vector machines (SVMs). For large-scale problems, however, the presence of the linear equality constraint impedes the application of several well-developed methods. In this paper, we first eliminate the linear equality constraint of the QP used in training the LS-SVM, reducing it to an unconstrained QP, and then propose a fast iterative single-data approach with stepsize acceleration for the unconstrained problem. By combining a variable-selection rule with the coordinate descent approach, the proposed method outperforms the successive over-relaxation (SOR) method. Moreover, because only one variable is updated at each iteration, it is simpler and more flexible than the sequential minimal optimization (SMO) method. Computational results on several benchmark data sets show that the proposed approach is more efficient than the existing single-data approach and the SMO method.
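To illustrate the kind of scheme the abstract describes, the following is a minimal sketch of greedy coordinate descent on an unconstrained QP of the form min_a (1/2) a^T H a - b^T a, such as the one obtained after eliminating the equality constraint from the LS-SVM dual. The function name, the largest-gradient selection rule, and all parameter choices are illustrative assumptions; this is not the paper's exact algorithm (in particular, it omits the stepsize acceleration).

```python
import numpy as np

def cd_unconstrained_qp(H, b, tol=1e-6, max_iter=10000):
    """Hypothetical sketch: greedy coordinate descent for
    min_a 0.5 * a^T H a - b^T a, with H symmetric positive definite.
    Not the paper's method; stepsize acceleration is omitted."""
    n = len(b)
    a = np.zeros(n)
    g = -b.copy()                      # gradient H a - b at a = 0
    for _ in range(max_iter):
        i = int(np.argmax(np.abs(g)))  # selection rule: coordinate with largest |gradient|
        if abs(g[i]) < tol:
            break                      # approximate stationarity reached
        delta = -g[i] / H[i, i]        # exact minimization along coordinate i
        a[i] += delta                  # single-variable update
        g += delta * H[:, i]           # cheap rank-one gradient refresh
    return a
```

Each iteration touches only one variable and one column of H, which is what makes single-data schemes attractive at large scale compared with methods that update all variables at once.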
