Abstract

Recent studies have shown how regularization may play an important role in linear system identification. An effective approach consists of searching for the impulse response in a high-dimensional space, e.g., a reproducing kernel Hilbert space (RKHS). Complexity is then controlled using a regularizer, e.g., the RKHS norm, able to encode smoothness and stability information. Examples are RKHSs induced by the so-called stable spline or tuned-correlated kernels, which contain a parameter that regulates the exponential decay of the impulse response. In this article, we derive nonasymptotic upper bounds on the $\ell_2$ error of these regularized schemes and study their optimality in order (in the minimax sense). Under white noise inputs and Gaussian measurement noises, we obtain conditions which ensure the optimal convergence rate for the whole class of stable spline estimators and several generalizations. Theoretical findings are then illustrated via a numerical experiment.
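
To make the setting concrete, the following is a minimal numerical sketch (not taken from the paper) of regularized FIR estimation with the tuned-correlated (TC) kernel under white-noise input and Gaussian measurement noise; the true impulse response, noise level, and hyperparameter values are assumed for illustration only.

```python
# Sketch of kernel-regularized FIR identification with the TC / first-order
# stable spline kernel K[i, j] = alpha**max(i, j), 0 < alpha < 1, whose
# hyperparameter alpha regulates the exponential decay of the impulse response.
import numpy as np

rng = np.random.default_rng(0)

n, N = 50, 200                      # FIR length and number of data points (assumed)
g_true = 0.8 ** np.arange(n)        # hypothetical exponentially decaying impulse response
u = rng.standard_normal(N)          # white-noise input, as in the analysis
Phi = np.array([[u[t - k] if t - k >= 0 else 0.0 for k in range(n)] for t in range(N)])
y = Phi @ g_true + 0.1 * rng.standard_normal(N)   # Gaussian measurement noise

alpha, gamma = 0.8, 0.1             # kernel decay rate and regularization parameter (assumed)
idx = np.arange(1, n + 1)
K = alpha ** np.maximum.outer(idx, idx)           # TC kernel matrix

# Regularized estimate: g_hat = argmin_g ||y - Phi g||^2 + gamma * g' K^{-1} g,
# computed via the closed form g_hat = K Phi' (Phi K Phi' + gamma I)^{-1} y.
g_hat = K @ Phi.T @ np.linalg.solve(Phi @ K @ Phi.T + gamma * np.eye(N), y)

print("relative l2 error:", np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true))
```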
