Abstract

Twin support vector regression (TSVR) was recently proposed as a novel regressor that finds a pair of nonparallel planes, i.e., the $\epsilon$-insensitive up- and down-bound functions, by solving two related SVM-type problems. Although TSVR performs well compared with conventional methods such as SVR, it suffers from the following issues: (1) it lacks model-complexity control and thus may incur overfitting and suboptimal solutions; (2) it needs to solve a pair of quadratic programming problems, which are relatively complex to implement; (3) it is sensitive to outliers; and (4) its solution is not sparse. To address these problems, we propose in this paper a novel regression algorithm termed robust and sparse twin support vector regression. The central idea is to first reformulate TSVR as a convex problem by introducing a regularization term, and then to derive a linear programming (LP) formulation that is not only simple but also admits robustness and sparseness. Instead of solving the resulting LP problem in the primal, we present a Newton algorithm with Armijo step-size to solve the corresponding exact exterior penalty problem. Experimental results on several publicly available benchmark data sets show the feasibility and effectiveness of the proposed method.
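
To illustrate the kind of LP-based formulation the abstract refers to, the sketch below casts a single $\epsilon$-insensitive bound estimation as a linear program with an L1 weight penalty (the L1 terms are what yield sparseness and robustness). This is only a simplified, single-bound illustration under our own assumptions, not the authors' twin formulation or their Newton-Armijo exterior-penalty solver; the function name and parameters (eps, C) are hypothetical.

```python
# Minimal sketch: epsilon-insensitive linear regression as a linear program
# with an L1 penalty on the weights, solved with scipy's LP interface.
import numpy as np
from scipy.optimize import linprog

def eps_insensitive_lp(X, y, eps=0.1, C=10.0):
    n, d = X.shape
    # Variables: [u (d), v (d), b_pos, b_neg, xi (n)], all >= 0,
    # with w = u - v and b = b_pos - b_neg, so ||w||_1 = sum(u + v).
    c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])
    ones = np.ones((n, 1))
    I = np.eye(n)
    # Enforce |X w + b - y| <= eps + xi as two one-sided inequalities.
    A_ub = np.vstack([
        np.hstack([ X, -X,  ones, -ones, -I]),
        np.hstack([-X,  X, -ones,  ones, -I]),
    ])
    b_ub = np.concatenate([eps + y, eps - y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    z = res.x
    w = z[:d] - z[d:2 * d]
    b = z[2 * d] - z[2 * d + 1]
    return w, b

# Toy usage on synthetic data with a sparse ground-truth weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
w_true = np.array([1.5, 0.0, -2.0, 0.0, 0.0])
y = X @ w_true + 0.05 * rng.normal(size=50)
w, b = eps_insensitive_lp(X, y)
print(np.round(w, 2), round(b, 2))
```

Because the objective and constraints are linear, the problem can be handed to any off-the-shelf LP solver; the paper instead solves an exact exterior penalty reformulation of such an LP with a Newton method using Armijo step-sizes.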
