Abstract

The ε-insensitive Least Squares Support Vector Regression (ε-LSSVR) optimization problem with ε-insensitive quadratic loss function is a non-smooth, piecewise-quadratic problem, so computing the optimal solution of ε-LSSVR takes longer than for classical LSSVR. Additionally, because of its unbounded influence function, ε-LSSVR, like classical LSSVR, generalizes poorly in the presence of outliers. To overcome these drawbacks, this paper presents an ε-insensitive Weighted Least Squares Support Vector Regression (ε-WLSSVR) with equality constraints to increase robustness against outliers. To reduce computational complexity and training time, the ε-WLSSVR optimization problem is solved with the Sequential Minimal Optimization (SMO) algorithm. Extensive simulations show that the presented ε-WLSSVR model combined with SMO is more effective and efficient than LSSVR and its known sparse variants.
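For context, the ε-insensitive quadratic loss mentioned above can be sketched as follows: errors inside the ε-tube incur no penalty, while errors outside it are penalized quadratically in the excess beyond ε. This is a minimal illustration only; the function name and the default ε value are assumptions, not from the paper.

```python
import numpy as np

def eps_insensitive_quadratic_loss(e, eps=0.1):
    """epsilon-insensitive quadratic loss (illustrative sketch):
    returns 0 when |e| <= eps, and (|e| - eps)**2 otherwise.
    Non-smooth at |e| = eps, and piecewise quadratic overall,
    which is why the resulting optimization problem is harder
    than the plain squared loss used by classical LSSVR."""
    excess = np.maximum(np.abs(e) - eps, 0.0)  # distance outside the eps-tube
    return excess ** 2

# Errors inside the tube cost nothing; outside, cost grows quadratically.
inside = eps_insensitive_quadratic_loss(0.05)   # |e| <= eps -> 0.0
outside = eps_insensitive_quadratic_loss(0.3)   # (0.3 - 0.1)**2
```

A weighted variant, as in ε-WLSSVR, would multiply each sample's loss by a per-sample weight w_i so that suspected outliers contribute less to the objective; the specific weighting scheme used in the paper is not reproduced here.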
