Abstract
A new smoothing strategy for solving ε-support vector regression (ε-SVR), which tolerates a small error in fitting a given data set linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely, a convex quadratic programming problem. We apply the smoothing techniques that have been used for solving the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR directly as an unconstrained minimization problem. We call this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, shown to converge globally and quadratically, to solve ε-SSVR. To handle nonlinear regression with a massive data set, we also introduce a reduced kernel technique that avoids the computational difficulty of dealing with a huge, fully dense kernel matrix. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm.
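The key idea in the abstract, replacing the nonsmooth ε-insensitive loss with an accurate smooth approximation so the problem becomes an unconstrained, twice-differentiable minimization, can be illustrated with the standard smooth plus function p(x, β) = x + (1/β)·log(1 + exp(−βx)), which converges to max(x, 0) as the smoothing parameter β grows. The sketch below is illustrative only: the function names, the value of β, and the exact form of the smooth surrogate are assumptions for demonstration, not the paper's precise formulation.

```python
import numpy as np

def plus(x):
    # the plus function (x)_+ = max(x, 0)
    return np.maximum(x, 0.0)

def smooth_plus(x, beta=10.0):
    # smooth approximation p(x, beta) = x + (1/beta) * log(1 + exp(-beta*x));
    # rewritten via logaddexp for numerical stability:
    # x + log(1 + e^{-beta*x})/beta == log(1 + e^{beta*x})/beta
    return np.logaddexp(0.0, beta * x) / beta

def eps_insensitive(r, eps=0.1):
    # the nonsmooth epsilon-insensitive loss |r|_eps = max(|r| - eps, 0):
    # residuals inside the eps-tube incur zero penalty
    return plus(np.abs(r) - eps)

def smooth_eps_insensitive(r, eps=0.1, beta=10.0):
    # illustrative smooth surrogate: apply the smooth plus function to each
    # side of the eps-tube; differentiable everywhere, so Newton-type
    # methods (such as the paper's Newton-Armijo scheme) can be applied
    return smooth_plus(r - eps, beta) + smooth_plus(-r - eps, beta)
```

For large β the surrogate tracks the original loss closely while staying smooth at the tube boundary, which is exactly what makes an unconstrained Newton-Armijo minimization applicable.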
Published in: IEEE Transactions on Knowledge and Data Engineering