Abstract

The data-aware method of distributions (DAMD) is a low-dimensional data assimilation procedure for forecasting the behavior of dynamical systems described by differential equations. At the core of DAMD is the minimization of a distance, in distributional terms, between an observation and a prediction, with the prior and posterior distributions constrained to a statistical manifold defined by the method of distributions (MD). We leverage the information-geometric properties of this statistical manifold to reduce predictive uncertainty via data assimilation. Specifically, we exploit the information-geometric structures induced by two discrepancy metrics, the Kullback-Leibler divergence and the Wasserstein distance, each of which yields an explicit natural gradient descent direction. Using a deep neural network as a surrogate model for MD enables automatic differentiation, further accelerating the optimization. The manifold's geometry is quantified without sampling, yielding an accurate approximation of the gradient descent direction. Our numerical experiments demonstrate that accounting for the manifold's geometry significantly reduces the computational cost of data assimilation, both by facilitating the calculation of gradients and by reducing the number of required iterations.
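To illustrate the core idea of natural gradient descent on a statistical manifold (not the paper's implementation, which uses MD-constrained distributions and a neural surrogate), the following toy sketch minimizes the KL divergence between a univariate Gaussian N(mu, sigma^2) and a standard normal target. The Fisher information matrix induced by the KL divergence for this parameterization is diag(1/sigma^2, 2/sigma^2); preconditioning the Euclidean gradient by its inverse gives the natural gradient. All names and parameter choices here are illustrative assumptions.

```python
def natural_gradient_descent(mu=2.0, sigma=0.5, lr=0.5, steps=200):
    """Minimize KL(N(mu, sigma^2) || N(0, 1)) by natural gradient descent.

    KL(q || p) = -log(sigma) + (sigma^2 + mu^2) / 2 - 1/2, so the
    Euclidean gradient is (mu, sigma - 1/sigma). The Fisher matrix for
    the (mu, sigma) parameterization is diag(1/sigma^2, 2/sigma^2);
    its inverse rescales the gradient into the natural gradient.
    """
    for _ in range(steps):
        # Euclidean gradient of the KL divergence
        g_mu = mu
        g_sigma = sigma - 1.0 / sigma
        # Precondition by the inverse Fisher matrix (natural gradient step)
        mu -= lr * sigma**2 * g_mu
        sigma -= lr * (sigma**2 / 2.0) * g_sigma
    return mu, sigma

mu, sigma = natural_gradient_descent()  # converges to the target (0, 1)
```

Because the Fisher preconditioner adapts the step to the manifold's local geometry, the iteration converges to (mu, sigma) = (0, 1) with far fewer steps than plain gradient descent would need for small sigma, which is the effect the abstract attributes to exploiting the manifold's geometry.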
