Abstract

Function estimation using the Reproducing Kernel Hilbert Space (RKHS) framework is a powerful tool for the identification of a general class of nonlinear dynamical systems without requiring much a priori information on the model orders and nonlinearities involved. However, the high degrees-of-freedom (DOFs) of RKHS estimators come at a price: for large-scale function estimation problems, such estimators often require a large number of data samples to explore the search space adequately and provide high-performance model estimates. In cases where the nonlinear dynamic relations can be expressed as a sum of functions, the literature addresses this issue by enforcing sparsity to restrict the DOFs of the estimator, resulting in parsimonious model estimates. Unfortunately, all existing solutions are based on greedy approaches, leading to optimization schemes which cannot guarantee convergence to the global optimum. In this paper, we propose an ℓ1-regularized non-parametric RKHS estimator which is the solution of a quadratic optimization problem. The effectiveness of the scheme is demonstrated on the non-parametric identification problem of LPV-IO models, where the method simultaneously solves (i) the model order selection problem (in terms of the number of input–output lags and the input delay in the model structure) and (ii) the estimation of the unknown functional dependency of the model coefficients on the scheduling variable, directly from data. The paper also provides an extensive simulation study to illustrate the effectiveness of the proposed scheme.

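For illustration only, the sketch below shows the kind of ℓ1-regularized, kernel-expansion estimate the abstract alludes to: the coefficient functions of a first-order LPV-IO model are expanded over a Gaussian kernel evaluated at the scheduling samples, and the expansion coefficients are obtained from an ℓ1-penalized least-squares problem. The data-generating system, the kernel choice, the regularization weight, and the use of cvxpy are all illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (NOT the paper's exact algorithm): l1-regularized kernel
# regression for a hypothetical LPV-IO model
#   y_t = a1(p_t) y_{t-1} + b1(p_t) u_{t-1} + e_t,
# where a1, b1 are expanded in a Gaussian RKHS over the scheduling variable p
# and an l1 penalty on the expansion coefficients promotes sparsity.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
N = 200

# Hypothetical data-generating LPV-IO system (first-order, p in [0, 1]).
u = rng.standard_normal(N)
p = rng.uniform(0.0, 1.0, N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = (0.5 * np.sin(2 * np.pi * p[t]) * y[t - 1]
            + (0.5 + p[t]) * u[t - 1]
            + 0.05 * rng.standard_normal())

def gauss_kernel(a, b, width=0.2):
    """Gaussian kernel matrix between scheduling samples a and b."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * width ** 2))

# Candidate regressors: lagged signals whose coefficients depend on p_t.
T = np.arange(1, N)                  # time indices with valid lags
K = gauss_kernel(p[T], p[T])         # Gram matrix over scheduling samples
regressors = [y[T - 1], u[T - 1]]    # candidate lag terms

# Column block for regressor x_i is diag(x_i) @ K, so that
# (block @ alpha_i)[t] = x_i(t) * sum_s alpha_{i,s} k(p_t, p_s) = x_i(t) * f_i(p_t).
Phi = np.hstack([x[:, None] * K for x in regressors])
target = y[T]

# l1-regularized least squares over the kernel expansion coefficients.
alpha = cp.Variable(Phi.shape[1])
lam = 1.0
problem = cp.Problem(
    cp.Minimize(cp.sum_squares(target - Phi @ alpha) + lam * cp.norm1(alpha))
)
problem.solve()

# Recover the estimated coefficient functions on the scheduling samples.
coef = np.split(alpha.value, len(regressors))
a1_hat = K @ coef[0]
b1_hat = K @ coef[1]
print("nonzero coefficients:", np.count_nonzero(np.abs(alpha.value) > 1e-6))
```

Since the ℓ1 penalty is the only non-quadratic term, this convex problem can be rewritten as a quadratic program, which is consistent with the abstract's claim; the grouping of coefficients per lag term and per candidate regressor set would follow the paper's own formulation.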