Abstract

Data assimilation aims to estimate posterior conditional probability density functions from the error statistics of the noisy observations and the dynamical system. State-of-the-art methods are sub-optimal because they commonly assume Gaussian error statistics and linearize the nonlinear dynamics. To achieve good performance, these methods often require case-by-case fine-tuning through explicit regularization techniques such as inflation and localization. In this paper, we propose a fully data-driven deep learning framework that generalizes recurrent Elman networks and data assimilation algorithms. Our approach approximates a sequence of prior and posterior densities conditioned on noisy observations using a log-likelihood cost function. By construction, it can therefore be applied to general nonlinear dynamics and non-Gaussian densities. As a first step, we evaluate the proposed approach on the fully and partially observed Lorenz-95 system, fitting the outputs of the recurrent network to Gaussian densities. We show numerically that, without any explicit regularization technique, our approach achieves performance comparable to the state-of-the-art methods IEnKF-Q and LETKF across various ensemble sizes.
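The abstract names three ingredients of the experimental setup: the Lorenz-95 dynamics used as the test system, a Gaussian negative log-likelihood cost on the network outputs, and an Elman-style recurrent update. The paper's actual implementation is not given here, so the following is only a minimal sketch of those ingredients under common conventions (forcing F = 8, RK4 time stepping, a diagonal-covariance Gaussian, and a single tanh Elman cell); all function names and parameter choices are illustrative assumptions, not the authors' code.

```python
import numpy as np

F = 8.0  # standard forcing value for the Lorenz-95 (Lorenz-96) system

def lorenz95_tendency(x, forcing=F):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05):
    """One fourth-order Runge-Kutta step of the Lorenz-95 dynamics."""
    k1 = lorenz95_tendency(x)
    k2 = lorenz95_tendency(x + 0.5 * dt * k1)
    k3 = lorenz95_tendency(x + 0.5 * dt * k2)
    k4 = lorenz95_tendency(x + dt / 6.0 * 0)  # placeholder removed below
    k4 = lorenz95_tendency(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def gaussian_nll(x, mean, log_var):
    """Negative log-likelihood of x under a diagonal Gaussian (mean, exp(log_var)).

    This is one choice of the log-likelihood cost mentioned in the abstract,
    assuming the network outputs a mean and a log-variance per state component.
    """
    return 0.5 * np.sum(np.log(2.0 * np.pi) + log_var
                        + (x - mean) ** 2 / np.exp(log_var))

def elman_step(h, y, W_h, W_y, b):
    """One Elman recurrent update: the hidden state is driven by the previous
    hidden state h and the current noisy observation y."""
    return np.tanh(W_h @ h + W_y @ y + b)
```

As a sanity check, the constant state x_i = F is a fixed point of the Lorenz-95 dynamics, and the Gaussian NLL is minimized when the predicted mean equals the observed state.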

