Abstract

Classical support vector machines are built on convex loss functions. Recently, support vector machines with non-convex loss functions have attracted much attention for their superiority to the classical ones in generalization accuracy and robustness. In this paper, we propose a non-convex loss function to construct a robust support vector regression (SVR). The introduced non-convex loss function includes several truncated loss functions as special cases. The resulting optimization problem is a difference-of-convex-functions (DC) program. We employ the concave–convex procedure and develop a Newton-type algorithm to solve it, which both retains the sparseness of SVR and suppresses outliers in the training samples. Experiments on both synthetic and real-world benchmark data sets confirm the robustness and effectiveness of the proposed method.
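As a minimal sketch of the kind of loss the abstract describes: a truncated ε-insensitive loss is a convex loss clipped at a level t, and it admits the difference-of-convex decomposition min(u, t) = u − max(u − t, 0) that concave–convex-procedure solvers exploit. The parameters `eps` and `t` below are assumptions for illustration, not the paper's exact loss.

```python
import numpy as np

def eps_insensitive(r, eps=0.1):
    """Convex epsilon-insensitive loss: max(|r| - eps, 0)."""
    return np.maximum(np.abs(r) - eps, 0.0)

def truncated_loss(r, eps=0.1, t=1.0):
    """Non-convex truncated loss: the convex loss clipped at level t
    (hypothetical instance of the truncated-loss family)."""
    return np.minimum(eps_insensitive(r, eps), t)

def dc_parts(r, eps=0.1, t=1.0):
    """DC decomposition: truncated loss = convex part - concave-part's negation,
    i.e. min(u, t) = u - max(u - t, 0), with both terms convex in u."""
    u = eps_insensitive(r, eps)
    return u, np.maximum(u - t, 0.0)

# Check the decomposition on a grid of residuals.
r = np.linspace(-3.0, 3.0, 7)
vex, cave = dc_parts(r)
assert np.allclose(truncated_loss(r), vex - cave)
```

Because the loss is bounded by t, large residuals from outliers contribute at most t to the objective, which is the mechanism behind the robustness claim.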
