Abstract

To derive a regressor that is robust to outliers, we propose a support vector regression (SVR) based on a non-convex quadratic insensitive loss function with a flexible coefficient and margin. The proposed loss function can be approximated by a difference of convex functions (DC), so the resulting optimization problem is a DC program, which we solve with Newton's method. The proposed model explicitly enhances the robustness and sparseness of SVR. Numerical experiments on six benchmark data sets show that it yields promising results.
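As a minimal illustrative sketch of the DC idea (not the paper's exact loss; the insensitivity width `eps` and the cap `cap` below are assumed parameters), a capped quadratic ε-insensitive loss is non-convex but can be written as the difference of two convex functions:

```python
import numpy as np

# Hypothetical sketch: a capped (hence non-convex) quadratic
# eps-insensitive loss written as a difference of two convex functions.
# The paper's actual loss, flexible coefficient, and margin are not
# reproduced here; eps and cap are illustrative assumptions.

def g(r, eps=0.1):
    """Convex component: quadratic eps-insensitive loss."""
    return np.maximum(np.abs(r) - eps, 0.0) ** 2

def h(r, eps=0.1, cap=1.0):
    """Convex component to subtract: max(g - cap, 0) is convex."""
    return np.maximum(g(r, eps) - cap, 0.0)

def dc_loss(r, eps=0.1, cap=1.0):
    """Non-convex capped loss: g - h equals min(g, cap) pointwise,
    so large residuals (outliers) incur a bounded penalty."""
    return g(r, eps) - h(r, eps, cap)

r = np.linspace(-3.0, 3.0, 13)
# The decomposition reproduces the capped loss exactly:
assert np.allclose(dc_loss(r), np.minimum(g(r), 1.0))
```

Because the loss is a difference of convex functions, a DC algorithm can iterate by linearizing the subtracted part `h` and solving the resulting convex subproblem at each step.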
