One way to understand time-series data is to identify the underlying dynamical system that generates it. This task can be done by selecting an appropriate model and a set of parameters that best fit the dynamics while providing the simplest representation (i.e., the fewest terms). One such approach is the sparse identification of nonlinear dynamics framework [6], which uses an iterative algorithm that alternates between a partial least-squares fit and a thresholding (sparsity-promoting) step. In this work, we provide theoretical results on the behavior and convergence of the algorithm proposed in [6]. In particular, we prove that the algorithm approximates local minimizers of an unconstrained $\ell^0$-penalized least-squares problem. From this, we derive sufficient conditions for general convergence, rates of convergence, and conditions for one-step recovery. Examples illustrate that the rates of convergence are sharp. In addition, our results extend to other algorithms related to the one in [6] and provide theoretical verification of several observed phenomena.
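As a rough illustration (not the authors' implementation), the sketch below alternates a least-squares fit restricted to the currently active terms with hard thresholding, which is the kind of iteration described above and is connected to an $\ell^0$-penalized least-squares objective such as $\min_c \|\Phi c - b\|_2^2 + \lambda^2 \|c\|_0$; the names `Phi` (candidate-feature matrix), `b` (measured time derivatives), and `lam` (threshold), as well as the exact penalty scaling, are placeholder assumptions rather than the paper's notation.

```python
import numpy as np

def sequentially_thresholded_lstsq(Phi, b, lam, max_iter=20):
    """Illustrative sketch: alternate a least-squares fit on the current
    support with hard thresholding at level lam. Names and stopping rule
    are assumptions, not taken from [6]."""
    # Start from the full least-squares solution.
    c = np.linalg.lstsq(Phi, b, rcond=None)[0]
    for _ in range(max_iter):
        support = np.abs(c) >= lam          # thresholding (sparsity-promoting) step
        c = np.zeros_like(c)
        if support.any():
            # Least-squares refit restricted to the surviving terms.
            c[support] = np.linalg.lstsq(Phi[:, support], b, rcond=None)[0]
    return c

# Toy usage: recover a sparse coefficient vector from noisy linear data.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((200, 10))
c_true = np.zeros(10)
c_true[[1, 4]] = [2.0, -3.0]
b = Phi @ c_true + 0.01 * rng.standard_normal(200)
print(sequentially_thresholded_lstsq(Phi, b, lam=0.5))
```

In this toy run the thresholding step zeroes out the small spurious coefficients from the initial fit, and the refit on the remaining two columns recovers values close to `c_true`; the paper's results concern when and how fast such iterations converge.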