Abstract

Subspace identification methods for multivariable linear parameter-varying (LPV) and bilinear state-space systems perform computations with data matrices whose number of rows grows exponentially with the order of the system. Even for relatively low-order systems with only a few inputs and outputs, the amount of memory required to store these data matrices exceeds what is currently available on the average desktop computer. This severely limits the applicability of the methods. In this paper, we present kernel methods for subspace identification that perform computations with kernel matrices of much smaller dimensions than the data matrices used in the original LPV and bilinear subspace identification methods. We also describe how regularization is integrated into these kernel methods and show the relation to least-squares support vector machines. Regularization is an important tool for balancing the bias and variance errors. We compare different regularization strategies in a simulation study.
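The computational idea behind replacing the tall data matrices by kernel (Gram) matrices can be sketched with a small regularized least-squares example. The sketch below is an illustration only, not the paper's LPV/bilinear subspace algorithm; the dimensions p (the exponentially growing row dimension), N (number of samples), ny, and the ridge weight lam are toy values chosen for the demonstration, and the data are random. It shows that the fitted signal of a ridge-regularized least-squares problem can be computed from the N x N Gram matrix Z^T Z instead of the p x p matrix Z Z^T.

import numpy as np

# Toy dimensions (assumptions for this sketch): p plays the role of the
# exponentially growing row dimension of the data matrix, N is the number
# of samples, lam is a ridge-regularization weight.
rng = np.random.default_rng(0)
p, N, ny = 2000, 100, 2
Z = rng.standard_normal((p, N))   # tall data matrix (p >> N)
Y = rng.standard_normal((ny, N))  # output data
lam = 1e-2                        # regularization weight (bias/variance trade-off)

# Primal form of the regularized least-squares fit Y = Theta Z + E:
# requires the p x p matrix Z Z^T, which is what becomes infeasible to store.
Theta = np.linalg.solve(Z @ Z.T + lam * np.eye(p), Z @ Y.T).T
fit_primal = Theta @ Z

# Dual (kernel) form: only the N x N Gram matrix K = Z^T Z is needed,
# because Y Z^T (Z Z^T + lam I)^{-1} Z = Y (K + lam I)^{-1} K.
K = Z.T @ Z
fit_dual = Y @ np.linalg.solve(K + lam * np.eye(N), K)

print(np.allclose(fit_primal, fit_dual))  # True: same fit from much smaller matrices

Note that the dual form returns only the fitted signal Theta Z, not the p-dimensional coefficient matrix Theta itself; for subspace identification this is typically sufficient, since the methods work with such projected data rather than with the coefficients. The ridge term lam also corresponds to the regularization whose bias/variance effect the paper studies.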
