Abstract

We propose a general framework for the identification of linear discrete-time hybrid systems in which arbitrary loss functions can be easily included. Our framework includes the algebraic (Vidal et al., 2003) and support vector regression (Lauer and Bloch, 2008a,b) methods as particular cases. Inspired by these approaches, we then propose an optimization framework that relies on the minimization of a product of loss functions. Here, the identification problem is recast as a nonlinear and non-convex, though continuous, optimization program that involves only the model parameters as variables. As a result, its complexity scales linearly with the number of data points and it can easily be solved using standard global optimization methods. Moreover, we show that by choosing a saturated loss function, such as Hampel's loss function, the algorithm can efficiently deal with noise and outliers in the data. The final result is a general framework for linear hybrid system identification that can deal efficiently with noise, outliers, and large data sets. Numerical experiments demonstrate the efficiency and robustness of the proposed approach.
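To make the product-of-losses idea concrete, the following is a minimal sketch (not the authors' implementation) of the objective for switched linear regression with several submodels: for each data point, the losses of all submodels are multiplied together, so the point only needs to be well explained by one of them. The saturated squared loss min(e², c²) used here is a simple stand-in for a Hampel-type loss, and SciPy's differential_evolution is used as a generic global optimizer; the saturation level c, the toy data, and the optimizer choice are all illustrative assumptions.

```python
# Sketch of a product-of-losses objective for switched linear regression.
# Assumptions: saturated squared loss min(e^2, c^2) as a stand-in for a
# Hampel-type loss; differential_evolution as a generic global optimizer.
import numpy as np
from scipy.optimize import differential_evolution

def product_of_losses(theta, X, y, n_modes, c=0.5):
    """Sum over data points of the product over submodels of a saturated loss.

    theta   -- flat vector concatenating the parameter vectors of all submodels
    X, y    -- regressors (N x d) and outputs (N,)
    n_modes -- number of linear submodels
    c       -- saturation level of the loss (illustrative choice)
    """
    d = X.shape[1]
    W = theta.reshape(n_modes, d)              # one parameter vector per mode
    errors = y[:, None] - X @ W.T              # residuals w.r.t. every submodel
    losses = np.minimum(errors ** 2, c ** 2)   # saturated (outlier-robust) loss
    return np.sum(np.prod(losses, axis=1))     # product over modes, sum over data

# Toy switched system: two modes, each data point generated by one of them.
rng = np.random.default_rng(0)
N, d = 200, 2
X = rng.normal(size=(N, d))
w_true = np.array([[1.0, -2.0], [0.5, 3.0]])
modes = rng.integers(0, 2, size=N)
y = np.einsum('ij,ij->i', X, w_true[modes]) + 0.05 * rng.normal(size=N)

bounds = [(-5, 5)] * (2 * d)                   # search box for the global optimizer
res = differential_evolution(product_of_losses, bounds, args=(X, y, 2), seed=0)
print(res.x.reshape(2, d))                     # recovered parameters, up to mode permutation
```

Note that the number of optimization variables is only n_modes × d, independent of the number of data points, which is what allows the complexity to scale linearly with the data set size.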
