Abstract

This paper proposes a method for reducing the scheduling dependency of linear parameter-varying (LPV) systems. In particular, both the dimension of the scheduling variable and the corresponding scheduling region are shrunk using kernel-based principal component analysis (PCA). Kernel PCA corresponds to linear PCA performed in a high-dimensional feature space, which extends linear PCA to nonlinear dimensionality reduction. It can therefore reduce complicated coefficient dependencies that cannot be simplified in a linear subspace, giving kernel PCA an advantage over linear techniques. This corresponds to mapping the original scheduling variables to a set of lower-dimensional variables via a nonlinear mapping. However, to recover the original coefficient functions of the model, this nonlinear mapping must be inverted, and such an inversion is not straightforward: the reduced scheduling variables are a nonlinear expansion of the original scheduling variables into a high-dimensional feature space, for which no inverse mapping is available. Therefore, it cannot generally be asserted that such an expansion has a “pre-image” in the original scheduling region. While pre-image approximation algorithms exist in the literature for Gaussian kernel-based PCA, we generalize pre-image estimation to other commonly used kernels and formulate an iterative pre-image estimation rule. Finally, we consider the case study of a physical system described by an LPV model and compare the performance of linear and kernel PCA-based LPV model reduction.
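The reduce-then-invert pipeline described above can be sketched with off-the-shelf tools. The snippet below is a minimal illustration, not the paper's method: the scheduling trajectory is synthetic, the kernel and its width are arbitrary choices, and scikit-learn's `inverse_transform` approximates pre-images by learning a kernel ridge regression back-map rather than by the iterative estimation rule the paper formulates.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Hypothetical scheduling trajectory: three original scheduling
# variables that actually vary along a one-dimensional nonlinear
# manifold (illustrative data only).
rng = np.random.default_rng(0)
t = rng.uniform(-1.0, 1.0, size=(200, 1))
P = np.hstack([t, t**2, np.sin(np.pi * t)])   # samples of the 3-D scheduling variable

# Kernel PCA with a Gaussian (RBF) kernel: compress the scheduling
# dependency into a single reduced scheduling variable.
kpca = KernelPCA(n_components=1, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True)  # also fits an approximate pre-image map
phi = kpca.fit_transform(P)                   # reduced scheduling variable, shape (200, 1)

# Approximate pre-images: map the reduced variable back into the
# original scheduling region, as needed to recover the model's
# coefficient functions.
P_hat = kpca.inverse_transform(phi)
rmse = np.sqrt(np.mean((P - P_hat) ** 2))
print(phi.shape, P_hat.shape, np.isfinite(rmse))
```

The reconstruction error `rmse` quantifies how well the pre-images recover the original scheduling region, which is the quantity a reduction of this kind trades against the lower scheduling dimension.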
