Abstract

Kernel analog forecasting (KAF), also known as kernel principal component regression, is a kernel method for nonparametric statistical forecasting of dynamically generated time series data. This paper synthesizes descriptions of kernel methods and Koopman operator theory to provide a single, consistent account of KAF. The framework presented here shows that, under measure-preserving and ergodic dynamics, KAF consistently approximates the conditional expectation of observables evolved by the Koopman operator of the dynamical system, conditioned on the observed data at forecast initialization. More precisely, in the asymptotic limit of large data, KAF yields optimal predictions in the sense of minimal root mean square error with respect to the invariant measure. The framework also facilitates the analysis of generalization error and the quantification of uncertainty. Extensions of KAF to the construction of conditional variance and conditional probability functions, as well as to non-symmetric kernels, are also presented. Various aspects of KAF are illustrated on simple examples, namely a periodic flow on the circle and the chaotic Lorenz 63 system.
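
To make the kernel principal component regression formulation concrete, the following is a minimal, self-contained sketch of a KAF-style forecast in Python. It is not the paper's implementation: the Gaussian kernel, the median bandwidth heuristic, the number of retained eigenpairs, the function name kaf_forecast, and the toy periodic flow on the circle are illustrative assumptions. The sketch forms a kernel matrix on training states, takes its leading eigenvectors, regresses the time-shifted observable onto them, and extends the eigenvectors to new initial states via the Nystrom formula.

```python
import numpy as np

def kaf_forecast(X_train, y_lead, X_test, n_eig=10, bandwidth=None):
    """Minimal kernel analog forecast (kernel principal component regression sketch).

    X_train : (N, d) training states (e.g., delay-embedded observations)
    y_lead  : (N,)   target observable sampled q steps ahead of each training state
    X_test  : (M, d) states at forecast initialization
    Returns an (M,) array of forecasts of the observable q steps ahead of X_test.
    """
    # Pairwise squared distances and a Gaussian (RBF) kernel on the training data.
    d2 = ((X_train[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    if bandwidth is None:
        bandwidth = np.sqrt(np.median(d2))  # simple median heuristic (an assumption)
    K = np.exp(-d2 / (2 * bandwidth ** 2))

    # Leading eigenpairs of the symmetric kernel matrix (the kernel PCA step).
    lam, phi = np.linalg.eigh(K)                              # ascending order
    lam, phi = lam[::-1][:n_eig], phi[:, ::-1][:, :n_eig]     # largest first

    # Least-squares regression of the time-shifted observable onto the eigenvectors.
    c = phi.T @ y_lead

    # Nystrom extension of the eigenvectors to the forecast-initialization points.
    d2_test = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K_test = np.exp(-d2_test / (2 * bandwidth ** 2))
    psi = K_test @ phi / lam                                  # (M, n_eig)

    return psi @ c


# Toy usage: periodic flow on the circle; forecast the x-coordinate q steps ahead.
dt, q, N = 0.05, 40, 2000
t = dt * np.arange(N + q)
states = np.stack([np.cos(t), np.sin(t)], axis=1)
X, y = states[:N], states[q:N + q, 0]
pred = kaf_forecast(X, y, X[:5])
print(np.c_[pred, states[q:q + 5, 0]])   # forecast vs. true future value
```

This is kernel principal component regression in its plainest form. KAF as described in the literature typically builds the kernel from delay-coordinate embeddings of the observed data and uses Markov-normalized or variable-bandwidth kernels, which improve robustness but do not change the structure of the computation above.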
