Abstract

Although kernel methods have been successfully applied to many different problems in system identification, choosing an optimal kernel structure can be challenging, particularly in higher-order problems. However, by noting that structural information, such as linearity, separability and smoothness, is contained in the functional derivatives, it can be seen that the kernel selection problem can be reduced to a simpler regularisation problem over specified derivatives. In this vein, a novel approach to the control of smoothness in nonparametric LPV identification is proposed here. By constraining the derivatives of the scheduling dependencies through a regularisation term, the model smoothness can be linearly controlled through the regularisation hyperparameter, without the need to optimise over the kernel function. A simulation example shows how different structural hypotheses can be tested at minimal extra cost to the user, and the proposed approaches are validated against a method from the literature.
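As a loose illustration of the general idea the abstract describes (this is a minimal sketch, not the paper's algorithm): instead of searching over kernel structures, one can penalise a derivative of the fitted function and control smoothness through a single regularisation hyperparameter. The finite-difference setup below assumes a simple 1-D grid for clarity.

```python
import numpy as np

# Illustrative sketch only -- not the paper's method. It shows how a
# regularisation term on a derivative controls smoothness via one
# hyperparameter, rather than via the kernel structure itself.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Second-order finite-difference operator D approximating f'' on the grid.
n = x.size
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]

def fit(lam):
    """Solve min_f ||f - y||^2 + lam * ||D f||^2 (penalised least squares)."""
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

rough = fit(1e-4)   # small penalty: the fit follows the noise
smooth = fit(1e2)   # large penalty: derivatives shrunk, much smoother fit

# A larger hyperparameter yields a smaller derivative norm, i.e. a
# smoother model, without any change to the model structure.
assert np.linalg.norm(D @ smooth) < np.linalg.norm(D @ rough)
```

Here the single hyperparameter `lam` plays the role the abstract assigns to the regularisation hyperparameter: smoothness is adjusted by re-solving one linear system, with no kernel-structure search.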
