Abstract

In recent years, neural networks such as multilayer perceptrons and radial basis function networks have been widely used in fields including control theory, signal processing, and nonlinear modelling. A promising newer methodology is the Support Vector Machine (SVM), originally introduced by Vapnik within the framework of statistical learning theory and structural risk minimization. SVM approaches to classification, nonlinear function estimation, and density estimation lead to convex optimization problems, typically quadratic programming. However, due to their non-parametric nature, existing SVM methods have been largely restricted to static problems. We discuss least squares support vector machines (LS-SVM), a method that has been extended to recurrent models and to use in optimal control problems. We explain how robust nonlinear estimation and sparse approximation can be achieved with this kernel-based technique, and give a short overview of hyperparameter tuning methods. SVM methods learn and generalize well in high-dimensional input spaces and have outperformed many existing methods on benchmark data sets; their full potential in a dynamical systems and control context remains to be explored.
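To illustrate the idea mentioned above, the following is a minimal sketch of LS-SVM regression in NumPy. Unlike the standard SVM, whose training is a quadratic program, the LS-SVM formulation with equality constraints and a squared-error cost reduces training to a single linear system in the bias `b` and the dual variables `alpha`. All function names, the RBF kernel choice, and the parameter values here are illustrative assumptions, not the paper's specific implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two sets of points."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM linear system (illustrative sketch):
        [ 0        1^T       ] [ b     ]   [ 0 ]
        [ 1   K + I / gamma  ] [ alpha ] = [ y ]
    where K is the kernel matrix and gamma the regularization constant."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.asarray(y, dtype=float)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # b, alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    """Predict f(x) = sum_i alpha_i k(x_i, x) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy usage: fit a sine curve from 40 samples.
X = np.linspace(0.0, 2.0 * np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y)
y_hat = lssvm_predict(X, b, alpha, X)
```

Note that because every data point gets a nonzero `alpha`, the plain LS-SVM solution is dense; the sparse-approximation and robust-estimation variants the abstract mentions modify this basic scheme (e.g. by pruning support values or reweighting the errors).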
