Abstract

Objective data assimilation methods such as variational and ensemble algorithms are attractive from a theoretical standpoint. Empirical nudging approaches are computationally efficient and can compensate for some model error by using arbitrarily large nudging coefficients. To take advantage of the strengths of both methods for analyses, combined nudging-ensemble approaches have recently been proposed. Here the two-scale Lorenz model is used to elucidate how the forecast error from nudging, ensemble, and nudging-ensemble schemes varies with model error. As expected, an ensemble filter and smoother are closest to optimal when model errors are small or absent. Model error is introduced by varying the model forcing, the coupling between scales, and spatial filtering. Nudging approaches perform relatively better with increased model error; the use of poor ensemble covariance estimates when model error is large harms the nudging-ensemble performance. Consequently, nudging-ensemble methods always produce error levels between those of the objective ensemble filters and empirical nudging, and can never provide analyses or short-range forecasts with lower errors than both. As long as the nudged state and the ensemble-filter state are close enough, the ensemble statistics are useful for the nudging, and fully coupling the ensemble and nudging by centering the ensemble on the nudged state is not necessary. An ensemble smoother produces the smallest errors overall except when model errors are very large. The results are qualitatively independent of tuning parameters such as covariance inflation and localization.
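
For readers unfamiliar with the test bed, the sketch below shows a two-scale Lorenz '96 system with a Newtonian-relaxation (nudging) term added to the large-scale variables, the basic setup the abstract refers to. The parameter values (K, J, F, h, c, b), the nudging coefficient G_nudge, and the observation setup are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch: two-scale Lorenz '96 model with nudging of the slow variables.
# All parameter values and names here are assumptions for illustration only.
import numpy as np

K, J = 8, 32                 # number of slow (X) and fast-per-slow (Y) variables
F = 10.0                     # forcing; varying F is one way to introduce model error
h, c, b = 1.0, 10.0, 10.0    # coupling strength and time/space-scale ratios

def tendency(X, Y):
    """Tendencies of the slow (X) and fast (Y) variables of the two-scale model."""
    Yk = Y.reshape(K, J)
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Yk.sum(axis=1))        # slow scale, coupled to Y
    dY = (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
          - c * Y + (h * c / b) * np.repeat(X, J))        # fast scale, coupled to X
    return dX, dY

def nudged_step(X, Y, X_obs, dt=0.005, G_nudge=2.0):
    """One Euler step with Newtonian relaxation of X toward observations X_obs.
    G_nudge is the empirical nudging coefficient (hypothetical value)."""
    dX, dY = tendency(X, Y)
    X_new = X + dt * (dX + G_nudge * (X_obs - X))  # nudging term added to the dynamics
    Y_new = Y + dt * dY
    return X_new, Y_new

# Usage: relax an imperfect model state toward noisy observations of X.
rng = np.random.default_rng(0)
X, Y = rng.standard_normal(K), 0.1 * rng.standard_normal(K * J)
X_obs = X + 0.5 * rng.standard_normal(K)
X, Y = nudged_step(X, Y, X_obs)
```

In this kind of setup, model error of the sort discussed in the abstract can be mimicked by running the forecast model with a perturbed forcing F or coupling h relative to the "truth" run that generates the observations.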
