Abstract

Data assimilation algorithms combine a numerical model with observations in a quantitative way. For an optimal combination, either variational minimization algorithms or ensemble-based estimation methods are applied. The computations of a data assimilation application are usually far more costly than a pure model integration. To cope with the large computational cost, good scalability of the assimilation program is required. Ensemble-based methods have been shown to exhibit particularly good scalability due to the natural parallelism inherent in the integration of an ensemble of model states. However, the scalability of the estimation method itself – commonly based on the Kalman filter – is also important. This study discusses implementation strategies for ensemble-based filter algorithms. A particularly efficient strategy is to strongly couple the model and the assimilation algorithm into a single executable program. The coupling can be performed with minimal changes to the numerical model itself and yields a model with a data assimilation extension. The scalability of the data assimilation system is examined using the example of an implementation of an ocean circulation model with the parallel data assimilation framework (PDAF) into which synthetic sea surface height data are assimilated.
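To illustrate the idea of extending a model's time-stepping loop with an ensemble filter analysis step, the sketch below shows a minimal, self-contained example in Python. It is not PDAF's actual API; all names, the toy model, and the stochastic ensemble Kalman filter update are illustrative assumptions that mimic the "model with data assimilation extension" described above, including the naturally parallel loop over ensemble members and the assimilation of synthetic observations.

```python
import numpy as np

# Illustrative sketch (not PDAF's API): a model time loop extended with an
# ensemble Kalman filter analysis step. The toy model and all parameters
# are hypothetical placeholders.

rng = np.random.default_rng(0)

n_state = 40         # state dimension of the toy model
n_ens = 20           # ensemble size
n_obs = 10           # number of synthetic observations
obs_err = 0.5        # observation error standard deviation
n_steps = 100        # total model time steps
assim_interval = 10  # assimilate every 10 steps

# Observation operator: observe every 4th grid point
obs_idx = np.arange(0, n_state, n_state // n_obs)

def model_step(x):
    """One time step of a toy model (damped random walk as a placeholder)."""
    return 0.98 * x + 0.1 * rng.standard_normal(x.shape)

def enkf_analysis(ens, y_obs):
    """Stochastic ensemble Kalman filter update given observations y_obs."""
    x_mean = ens.mean(axis=1, keepdims=True)
    X = ens - x_mean                      # state perturbations
    HX = ens[obs_idx, :]                  # observed ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)
    # Ensemble covariances in observation and cross space
    Pyy = HXp @ HXp.T / (n_ens - 1) + obs_err**2 * np.eye(n_obs)
    Pxy = X @ HXp.T / (n_ens - 1)
    K = Pxy @ np.linalg.solve(Pyy, np.eye(n_obs))   # Kalman gain
    # Perturbed observations for each ensemble member
    Y = y_obs[:, None] + obs_err * rng.standard_normal((n_obs, n_ens))
    return ens + K @ (Y - HX)

# "Truth" run used only to generate synthetic observations
x_truth = rng.standard_normal(n_state)
# Initial ensemble: truth plus random perturbations
ensemble = x_truth[:, None] + rng.standard_normal((n_state, n_ens))

for step in range(1, n_steps + 1):
    x_truth = model_step(x_truth)
    # Forward integration of all ensemble members (naturally parallel)
    for m in range(n_ens):
        ensemble[:, m] = model_step(ensemble[:, m])
    if step % assim_interval == 0:
        # Synthetic observations of the truth at observed grid points
        y = x_truth[obs_idx] + obs_err * rng.standard_normal(n_obs)
        ensemble = enkf_analysis(ensemble, y)
        rmse = np.sqrt(np.mean((ensemble.mean(axis=1) - x_truth) ** 2))
        print(f"step {step:3d}: analysis RMSE = {rmse:.3f}")
```

In an online-coupled system of the kind discussed in the paper, the inner loop over ensemble members would be distributed over parallel model tasks and the analysis step would be performed by the assimilation framework, so that only calls at the marked points of the time loop need to be added to the numerical model.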
