Abstract

Least Squares Support Vector Regression (LS-SVR) is a powerful kernel-based learning tool for regression problems. However, since it is based on the ordinary least squares (OLS) approach for parameter estimation, the standard LS-SVR model is very sensitive to outliers. Robust variants of the LS-SVR model, such as the WLS-SVR and IRLS-SVR models, have been developed with the aim of adding robustness to the parameter estimation process, but they still rely on OLS solutions. In this paper we propose a fundamentally different approach to robustifying the LS-SVR. Unlike previous models, we keep the original LS-SVR loss function, but solve the resulting linear system for parameter estimation by means of the Recursive Least M-estimate (RLM) algorithm. We evaluate the proposed approach on nonlinear system identification tasks, using artificial and real-world datasets contaminated with outliers. The results obtained for infinite-steps-ahead prediction show that the proposed model consistently outperforms the WLS-SVR and IRLS-SVR models in all studied scenarios.
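The core idea of the abstract — keeping the standard LS-SVR dual linear system but solving it with a recursive least M-estimate recursion instead of a direct OLS-style inversion — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the Huber weight function, the RBF kernel, the hyperparameters `gamma`, `sigma`, the RLS initialization `lam`, and the single-pass row ordering are all assumptions made for the sketch.

```python
import numpy as np


def rbf_kernel(X, Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def huber_weight(e, delta=1.345):
    """M-estimate weight: 1 for small residuals, delta/|e| beyond delta."""
    a = abs(e)
    return 1.0 if a <= delta else delta / a


def rlm_solve(A, rhs, lam=1e-2, delta=1.345, epochs=1):
    """Solve A x ~ rhs row by row with an RLM-style recursion: a standard
    RLS update whose gain is scaled by a robust weight of the a priori
    error, so equations corrupted by outliers are downweighted."""
    n = A.shape[1]
    x = np.zeros(n)
    P = np.eye(n) / lam                 # inverse-correlation init (RLS)
    for _ in range(epochs):
        for a_i, b_i in zip(A, rhs):
            e = b_i - a_i @ x           # a priori error (innovation)
            q = huber_weight(e, delta)  # robust weight in [0, 1]
            Pa = P @ a_i
            k = (q * Pa) / (1.0 + q * (a_i @ Pa))  # robustly scaled gain
            x = x + k * e
            P = P - np.outer(k, Pa)     # rank-one covariance downdate
    return x


def fit_ls_svr_rlm(X, y, gamma=10.0, sigma=1.0, epochs=5):
    """Assemble the standard LS-SVR dual system
        [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    and solve it with rlm_solve instead of a direct inversion."""
    m = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((m + 1, m + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(m) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = rlm_solve(A, rhs, epochs=epochs)
    return sol[0], sol[1:]              # bias b, dual weights alpha
```

Predictions then follow the usual LS-SVR form, f(x) = sum_i alpha_i K(x, x_i) + b; the robustness enters only through the solver, since the robust weight `q` shrinks the gain applied to equations with large innovations.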

