Abstract

The support vector machine solves a quadratic programming problem with linear inequality and equality constraints. However, solving this quadratic problem is not trivial. The least squares support vector machine (LS-SVM) instead uses equality constraints, which reduces training to solving a linear system. LS-SVM is a popular method in regression and classification problems because it only requires solving simple linear systems. However, the LS-SVM solution has two shortcomings: a lack of robustness to outliers and an absence of sparseness. In this paper, we propose a sparse and robust support vector machine for regression problems using the least absolute deviation support vector machine (LAD-SVM) and the recursive reduced LS-SVM (RR-LS-SVM). The split-Bregman iteration gives the exact solution of the LAD-SVM problem, while RR-LS-SVM yields a sparse solution with a much smaller number of support vectors. Numerical experiments on simulated and benchmark data demonstrate that the proposed algorithm achieves performance comparable to other methods in terms of robustness and sparseness.
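To illustrate the linear system the abstract refers to, the following is a minimal sketch of standard LS-SVM regression (not the proposed LAD-SVM/RR-LS-SVM method). The function names, the RBF kernel choice, and the hyperparameter values are illustrative assumptions; training amounts to one solve of the usual LS-SVM block system in the bias b and the coefficients alpha.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM regression: equality constraints turn training into one
    # linear system in (b, alpha):
    #   [ 0      1^T       ] [  b  ]   [ 0 ]
    #   [ 1   K + I/gamma  ] [alpha] = [ y ]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    # f(x) = sum_i alpha_i K(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Note that in this baseline the alpha_i are generally all nonzero (every training point is a support vector) and the squared-error loss gives outliers quadratic influence, which are exactly the two shortcomings the proposed method addresses.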
