Kernel-based regularization approaches have been successfully applied to regression problems in recent years. Recently, these machine learning techniques have also been introduced in linear system identification, by interpreting impulse response estimation as a function learning problem. The adopted estimator solves a regularized least squares problem which also admits a Bayesian interpretation, where the impulse response is modeled as a zero-mean Gaussian vector. A possible choice for the covariance is the so-called stable spline kernel. It encodes information on smoothness and exponential stability, and contains just two unknown parameters which can be tuned via marginal likelihood (ML) optimization. Experimental evidence has shown that this new approach may outperform traditional system identification approaches, such as PEM and subspace techniques. The aim of this work is to provide new insights on the stable spline estimator equipped with ML estimation of hyperparameters. To this end, we study the mean squared error properties of the ML procedure for hyperparameter estimation; in doing so we shall not assume the correctness of the Bayesian priors. Then, we derive the notion of excess degrees of freedom. This notion measures the additional complexity to be assigned to an estimator which is also required to determine hyperparameters from data. The conclusion of our investigation is that much of the criticism reported in the literature regarding the robustness of ML is not well founded. On the contrary, in many situations ML can well balance data fit and excess degrees of freedom. Hence, it turns out to be an important tool for tuning model complexity in linear system identification, also when undermodeling affects the kernel-based description of the impulse response.
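To make the setting concrete, the following is a minimal NumPy sketch of the estimator the abstract describes: a regularized least squares impulse response estimate using a first-order stable spline (TC) kernel, with the negative log marginal likelihood available as the objective for hyperparameter tuning. All function names, the kernel parameterization, and the data are illustrative assumptions, not the paper's code.

```python
import numpy as np

def tc_kernel(n, lam, beta):
    # First-order stable spline (TC) kernel: K[i, j] = lam * beta**max(i, j).
    # Encodes smoothness and exponential decay; (lam, beta) are the two
    # hyperparameters mentioned in the abstract (illustrative parameterization).
    idx = np.arange(n)
    return lam * beta ** np.maximum.outer(idx, idx)

def regression_matrix(u, N, n):
    # Toeplitz matrix of past inputs so that y ≈ Phi @ g for an FIR model.
    Phi = np.zeros((N, n))
    for k in range(n):
        Phi[k:, k] = u[:N - k]
    return Phi

def estimate_impulse_response(u, y, n, lam, beta, sigma2):
    # Regularized LS / posterior mean under the zero-mean Gaussian prior:
    # g_hat = K Phi^T (Phi K Phi^T + sigma2 I)^{-1} y
    Phi = regression_matrix(u, len(y), n)
    K = tc_kernel(n, lam, beta)
    S = Phi @ K @ Phi.T + sigma2 * np.eye(len(y))
    return K @ Phi.T @ np.linalg.solve(S, y)

def neg_log_marglik(u, y, n, lam, beta, sigma2):
    # Negative log marginal likelihood of y ~ N(0, Phi K Phi^T + sigma2 I);
    # minimizing this over (lam, beta) is the ML hyperparameter tuning step.
    Phi = regression_matrix(u, len(y), n)
    S = Phi @ tc_kernel(n, lam, beta) @ Phi.T + sigma2 * np.eye(len(y))
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * (logdet + y @ np.linalg.solve(S, y))
```

In practice the two kernel hyperparameters would be obtained by passing `neg_log_marglik` to a numerical optimizer; the sketch keeps them fixed to stay self-contained.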