Abstract

In least squares support vector regression (LSSVR), Vapnik's original SVR formulation is modified by replacing the ε-insensitive loss function with a cost function corresponding to a form of ridge regression. As a result, nonlinear function estimation is carried out by solving a linear set of equations instead of a time-consuming quadratic programming problem. When gradients or Hessians at the samples can be obtained cheaply, they should be exploited in the construction of metamodels. In this paper, the gradient/Hessian-enhanced LSSVR (G/HELSSVR) is developed by incorporating gradient and Hessian information into the traditional LSSVR. The performance of this method is tested by analytical function fitting. The experimental results illustrate that the proposed G/HELSSVR model has great advantages over the traditional LSSVR and the gradient-enhanced LSSVR (GELSSVR).
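To illustrate the "linear system instead of QP" point, the following is a minimal sketch of standard (non-enhanced) LSSVR with a Gaussian kernel, not the paper's G/HELSSVR method. The regularization parameter `gamma`, kernel width `sigma`, and function names are illustrative assumptions; the dual variables `alpha` and bias `b` come from one dense linear solve.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=100.0, sigma=1.0):
    # LSSVR training reduces to one symmetric linear system:
    # [ 0   1^T         ] [b    ]   [0]
    # [ 1   K + I/gamma ] [alpha] = [y]
    # rather than the QP required by epsilon-insensitive SVR.
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvr_predict(Xq, X, b, alpha, sigma=1.0):
    # f(x) = sum_i alpha_i * K(x, x_i) + b
    return rbf_kernel(Xq, X, sigma) @ alpha + b
```

The gradient/Hessian-enhanced variants augment the right-hand side and kernel matrix with derivative information, enlarging the same kind of linear system rather than changing the solution strategy.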
