Abstract

The Least Squares Support Vector Machine (LS-SVM) is a computationally efficient kernel-based regression approach that has recently been applied to the nonparametric identification of Linear Parameter Varying (LPV) systems. In contrast to parametric LPV identification approaches, LS-SVM-based methods obviate the need to parameterize the scheduling dependence of the LPV model coefficients in terms of a priori specified basis functions. However, accurate selection of the underlying model order (in terms of the number of input lags, output lags, and input delay) remains a critical issue in the identification of LPV systems in the LS-SVM setting. In this paper, we address this issue by extending the LS-SVM method to sparse LPV model identification, which, besides nonparametric estimation of the model coefficients, achieves data-driven model order selection via convex optimization. The main idea of the proposed method is to first estimate the coefficients of an over-parameterized LPV model through LS-SVM. The estimated coefficients are then scaled by weights, which are shrunk towards zero to enforce sparsity in the final LPV model estimate. The properties of the proposed approach are illustrated via simulation examples.
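To make the two-step idea in the abstract concrete, the following is a minimal Python sketch, not the authors' exact algorithm: it simulates a simple first-order LPV-ARX system, fits an over-parameterized model with a bias-free LS-SVM using an RBF kernel over the scheduling variable, and then shrinks per-coefficient scaling weights with an L1 penalty to suggest the underlying order. All names, lag orders, and tuning constants (gamma, sigma, lam) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate an LPV-ARX system: y(k) = a1(p) y(k-1) + b1(p) u(k-1) + noise ---
N = 400
u = rng.standard_normal(N)
p = 0.5 * np.sin(0.1 * np.arange(N)) + 0.5          # scheduling signal in [0, 1]
y = np.zeros(N)
for k in range(1, N):
    a1 = 0.6 * p[k]                                  # scheduling-dependent coefficients
    b1 = 1.0 - 0.5 * p[k]
    y[k] = a1 * y[k - 1] + b1 * u[k - 1] + 0.05 * rng.standard_normal()

# --- Build an over-parameterized regressor: more lags than the true order ---
na, nb = 3, 3                                        # deliberately larger than needed
start = max(na, nb)
X = np.column_stack(
    [y[start - i : N - i] for i in range(1, na + 1)]
    + [u[start - j : N - j] for j in range(1, nb + 1)]
)
Y = y[start:]
P = p[start:]
n_coef = X.shape[1]

# --- Step 1: LS-SVM estimate of the coefficient functions (bias term omitted) ---
def rbf(pa, pb, sigma=0.2):
    return np.exp(-((pa[:, None] - pb[None, :]) ** 2) / (2 * sigma ** 2))

K = rbf(P, P)
gamma = 100.0                                        # regularization constant
# Dual system: Omega_{kl} = K(p_k, p_l) * <x_k, x_l>, alpha = (Omega + I/gamma)^{-1} Y
Omega = K * (X @ X.T)
alpha = np.linalg.solve(Omega + np.eye(len(Y)) / gamma, Y)

# Nonparametric coefficient estimates at the training scheduling points:
# c_hat_i(p_k) = sum_l alpha_l x_{i}(l) K(p_l, p_k)
C_hat = K.T @ (alpha[:, None] * X)                   # shape (n_samples, n_coef)

# --- Step 2: shrink per-coefficient scaling weights to enforce sparsity ---
# Solve min_s 0.5*||Y - (C_hat * X) s||^2 + lam*||s||_1 by proximal gradient (ISTA).
Phi = C_hat * X                                      # column i carries coefficient i
lam = 5.0
L = np.linalg.norm(Phi, 2) ** 2                      # Lipschitz constant of the gradient
s = np.ones(n_coef)
for _ in range(2000):
    grad = Phi.T @ (Phi @ s - Y)
    s = s - grad / L
    s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)   # soft-thresholding

print("scaling weights (lags y1..y3, u1..u3):", np.round(s, 3))
# Weights on spurious lags tend to shrink toward zero, hinting at the true
# first-order structure, while nonzero weights retain the LS-SVM coefficients.
```

In this sketch the sparsity step acts on one scalar weight per lag rather than on the weighting functions used in the paper, which keeps the second step a standard convex (LASSO-type) problem while preserving the spirit of data-driven order selection.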
