Abstract
Ensemble-variational methods form the basis of the state-of-the-art for nonlinear, scalable data assimilation, yet current designs may not be cost-effective for reducing prediction error in online, short-range forecast systems. We propose a novel, outer-loop optimization of the ensemble-variational formalism for applications in which forecast error dynamics are weakly nonlinear, such as synoptic meteorology. In order to rigorously derive our method and demonstrate its novelty, we review ensemble smoothers that appear throughout the literature in a unified Bayesian maximum-a-posteriori narrative, updating and simplifying some results. After mathematically deriving our technique, we systematically develop and inter-compare all studied schemes in the open-source Julia package DataAssimilationBenchmarks.jl, with pseudo-code provided for these methods. This high-performance numerical framework, supporting our mathematical results, produces extensive benchmarks that demonstrate the significant performance advantages of our proposed technique. In particular, our single-iteration ensemble Kalman smoother is shown both to improve prediction/posterior accuracy and to simultaneously reduce the leading-order cost of iterative, sequential smoothers in a variety of relevant test cases for operational short-range forecasts. This work is thus intended to present our novel single-iteration ensemble Kalman smoother, and to provide a theoretical and computational framework for the study of sequential, ensemble-variational Kalman filters and smoothers generally.
Highlights
Ensemble-variational methods form the basis of the state-of-the-art for nonlinear, scalable data assimilation (DA) (Asch et al., 2016; Bannister, 2017)
The single-iteration ensemble Kalman smoother (SIEnKS) demonstrates significantly improved smoother accuracy over the linearized iterative ensemble Kalman smoother (Lin-IEnKS) while remaining at a lower leading-order cost. This suggests that the sequential multiple data assimilation (MDA) scheme of the SIEnKS is better equipped to handle highly nonlinear observation operators than the 4D-maximum a posteriori (MAP) formalism, which appears to suffer from a greater number of local minima
We provide a detailed review of the state-of-the-art for sequential, ensemble-variational Kalman filters and smoothers in perfect models within the Bayesian MAP formalism of the IEnKS
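As a rough, illustrative sketch of the ensemble-space (weight-space) MAP analysis that underlies the filters and smoothers reviewed in this formalism, the following Julia fragment computes a square-root, ETKF-style update. It is not the DataAssimilationBenchmarks.jl implementation: the function name etkf_analysis is hypothetical, the observation operator H is assumed linear, and inflation and localization are omitted.

using LinearAlgebra, Statistics

# Minimal sketch (not the package implementation) of a square-root, weight-space
# MAP analysis: the quadratic cost in the weights w is
#   J(w) = 0.5 * w'w + 0.5 * (δ - S*w)'(δ - S*w),
# with S the normalized, observed anomalies and δ the normalized innovation.
#   E :: n x N forecast ensemble,  y :: length-d observation vector,
#   H :: d x n linear observation operator,  R :: d x d observation error covariance.
function etkf_analysis(E, y, H, R)
    N = size(E, 2)
    x_mean = mean(E, dims=2)
    A = E .- x_mean                            # ensemble anomalies
    R_inv_sqrt = inv(sqrt(Symmetric(R)))
    S = R_inv_sqrt * (H * A) / sqrt(N - 1)     # normalized, observed anomalies
    δ = R_inv_sqrt * (y - vec(H * x_mean))     # normalized innovation
    Ω = Symmetric(I + S' * S)                  # Hessian of the weight-space cost
    w = Ω \ (S' * δ)                           # minimizer: mean update in ensemble space
    T = sqrt(inv(Ω))                           # symmetric square-root transform
    return x_mean .+ A * (w * ones(1, N) / sqrt(N - 1) + T)
end

Applied with a nonlinear ensemble forecast between observation times, an update of this form is the analysis kernel of a filtering cycle; the smoothers reviewed here differ chiefly in how such an update is carried back over the data assimilation window.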
Summary
Ensemble-variational methods form the basis of the state-of-the-art for nonlinear, scalable data assimilation (DA) (Asch et al, 2016; Bannister, 2017). When 40 the linear-Gaussian approximation for the forecast error dynamics is adequate, nonlinearity in the DA cycle may instead by dominated by nonlinearity in the observation operator, nonlinearity in hyper-parameter optimization, and / or nonlinearity in temporally interpolating a re-analyzed, smoothed solution over the DAW In this setting, our novel formulation of iterative, ensemble-variational smoothing has substantial advantages in balancing the computational cost / prediction accuracy trade off for these estimators
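To make the single-pass structure concrete, the following schematic Julia sketch filters an ensemble forward through the DAW while retrospectively reanalyzing the ensemble at the start of the window with the same ensemble-space transform. It is not the DataAssimilationBenchmarks.jl implementation: the helpers forecast_model and etkf_transform are hypothetical (the latter is assumed to return the weight vector and square-root transform of an analysis such as the one sketched above), and inflation, localization, MDA, and the shifting of the DAW are omitted.

using LinearAlgebra, Statistics

# Schematic sketch of one smoothing cycle over a DAW using a single forward pass:
# sequential filter analyses at each observation time plus a retrospective
# reanalysis of the initial ensemble by the same right-transform.
function single_pass_smoother_cycle(E0, observations, forecast_model, etkf_transform)
    N = size(E0, 2)
    E  = copy(E0)           # ensemble filtered forward through the DAW
    Es = copy(E0)           # smoothed (reanalyzed) ensemble at the start of the DAW
    for y in observations
        E = forecast_model(E)                 # nonlinear ensemble forecast to the next obs time
        w, T = etkf_transform(E, y)           # ensemble-space weights and square-root transform
        U = w * ones(1, N) / sqrt(N - 1) + T  # right-transform of the analysis update
        x_mean = mean(E, dims=2)
        E = x_mean .+ (E .- x_mean) * U       # filter analysis at the current time
        x0_mean = mean(Es, dims=2)
        Es = x0_mean .+ (Es .- x0_mean) * U   # retrospective reanalysis of the DAW start
    end
    return Es, E    # posterior ensemble at the start of the DAW and filtered ensemble at its end
end

When forecast error dynamics are weakly nonlinear, a scheme of this type can initialize the next cycle from the reanalyzed ensemble, so that the smoothed posterior is interpolated over the DAW in a single pass rather than through repeated, iterative passes of the ensemble simulation as in the 4D-MAP approach.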