Abstract

The analysis in nonlinear variational data assimilation is the solution of a non-quadratic minimization. Thus, the analysis efficiency relies on its ability to locate a global minimum of the cost function. If this minimization uses a Gauss–Newton (GN) method, it is critical for the starting point to be in the attraction basin of a global minimum. Otherwise the method may converge to a local minimum, which degrades the analysis. With chaotic models, the number of local minima often increases with the temporal extent of the data assimilation window, making the former condition harder to satisfy. This is unfortunate because the assimilation performance also increases with this temporal extent. However, a quasi-static (QS) minimization may overcome these local minima. It accomplishes this by gradually injecting the observations into the cost function. This method was introduced by Pires et al. (1996) in a 4D-Var context. We generalize this approach to four-dimensional strong-constraint nonlinear ensemble variational (EnVar) methods, which are based on both a nonlinear variational analysis and the propagation of dynamical error statistics via an ensemble. This forces one to consider the cost function minimizations in the broader context of cycled data assimilation algorithms. We adapt the QS approach to the iterative ensemble Kalman smoother (IEnKS), an exemplar of nonlinear deterministic four-dimensional EnVar methods. Using low-order models, we quantify the positive impact of the QS approach on the IEnKS, especially for long data assimilation windows. We also examine the computational cost of QS implementations and suggest cheaper algorithms.
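To make the quasi-static idea concrete, here is a minimal Python sketch under illustrative assumptions: a toy scalar model (a logistic map, chosen only because it is chaotic) is observed over a window, and the observations are injected one at a time, each stage warm-starting the minimization from the previous minimizer. The names `model`, `cost`, and `qs_minimize` and all parameter values are made up for illustration, not code from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def model(x0, k):
    """Toy nonlinear forward model: k iterations of the logistic map."""
    x = x0
    for _ in range(k):
        x = 3.7 * x * (1.0 - x)
    return x

def cost(x0, obs, b, sigma_b, sigma_o):
    """Strong-constraint 4D-Var-like cost: background term plus the
    misfit to the observations injected so far."""
    x0 = float(np.ravel(x0)[0])          # scipy passes a 1-D array
    jb = 0.5 * ((x0 - b) / sigma_b) ** 2
    jo = sum(0.5 * ((model(x0, k) - y) / sigma_o) ** 2
             for k, y in enumerate(obs, start=1))
    return jb + jo

def qs_minimize(all_obs, b, sigma_b=0.1, sigma_o=0.05):
    """Quasi-static minimization: inject observations one at a time,
    warm-starting each stage from the previous minimizer, so the
    iterate tends to stay in the attraction basin of the global minimum."""
    x = b
    for n in range(1, len(all_obs) + 1):
        res = minimize(cost, x, args=(all_obs[:n], b, sigma_b, sigma_o),
                       method="Nelder-Mead")
        x = float(res.x[0])
    return x

# Usage: observations generated from a hidden truth; the QS schedule
# recovers an initial condition consistent with the whole window.
rng = np.random.default_rng(0)
x_true = 0.35
obs = [model(x_true, k) + 0.05 * rng.standard_normal() for k in range(1, 9)]
print(qs_minimize(obs, b=0.30))
```

A direct minimization of the full-window cost from the background alone would face all the local minima at once; the gradual schedule is the essence of the QS approach.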

Highlights

  • Data assimilation (DA) aims at gathering knowledge about the state of a system from acquired observations

  • After reviewing the 4D-Var and iterative ensemble Kalman smoother (IEnKS) algorithms, we investigate how assimilation performance depends on the key parameters of the data assimilation window (DAW)

  • We find that increasing the DAW length L improves the smoothing expected MSE (eMSE), while the shift S improves the filtering eMSE (the roles of L and S are sketched after this list)
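The following schematic illustrates the window bookkeeping behind the parameters L (DAW length) and S (shift between cycles, both counted in observation batches) for a cycled smoother such as the IEnKS. It is an assumed illustration of the cycling geometry only, not the ensemble analysis itself; `daw_schedule` is a hypothetical helper name.

```python
def daw_schedule(L, S, n_cycles):
    """For each cycle, list the observation times inside the DAW and the
    times of the smoothing (window start) and filtering (window end)
    estimates."""
    for c in range(n_cycles):
        t0 = c * S                                    # DAW start time
        yield {"cycle": c,
               "window_obs": list(range(t0 + 1, t0 + L + 1)),
               "smoothing_time": t0,
               "filtering_time": t0 + L}

# Usage: L = 4, S = 1 gives strongly overlapping windows;
# S = L would give non-overlapping ones.
for w in daw_schedule(L=4, S=1, n_cycles=3):
    print(w)
```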


Summary

Context

Data assimilation (DA) aims at gathering knowledge about the state of a system from acquired observations. Each analysis yields a posterior pdf, which is propagated in time with the model to provide the prior pdf of the next assimilation cycle; a single model propagation is sufficient to estimate the prior pdf mean. This global approach calls for efficient global optimization routines. The Gauss–Newton method's ability to locate the global minimum depends on the minimization starting point and on the properties of the cost function. Missing this global minimum is likely to cause a quick divergence (from the truth) of the sequential DA method, so it is critical for the assimilation algorithm to keep the minimization starting point within the basin of attraction of a global minimum.
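To see why the starting point matters, here is a minimal Gauss–Newton sketch on a made-up residual whose cost has two minima: started in different basins of attraction, the same iteration converges to different solutions. The residual is a toy choice for illustration, not the assimilation cost function.

```python
import numpy as np

def gauss_newton(r, jac, x0, n_iter=50):
    """Gauss-Newton iterations: x <- x - (J^T J)^{-1} J^T r(x)."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    for _ in range(n_iter):
        J = np.atleast_2d(jac(x))
        dx = np.linalg.solve(J.T @ J, J.T @ np.atleast_1d(r(x)))
        x = x - dx
    return x

# Toy residual: J(x) = 0.5 * r(x)^2 has two minima, at x = +1 and x = -1.
r = lambda x: x**2 - 1.0
jac = lambda x: 2.0 * x

print(gauss_newton(r, jac, x0=+0.5))   # starting in the right basin -> +1
print(gauss_newton(r, jac, x0=-0.5))   # starting in the left basin  -> -1
```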

  • Quasi-static variational data assimilation
  • Ensemble variational methods
  • Outline
  • The data assimilation window and assimilation performance
  • Performance of assimilation
  • Multiple local minima
  • Effective data assimilation window length
  • Quasi-static algorithms
  • Numerical experiments with low-order models
  • Conclusions
