Abstract

The development of machine learning (ML) techniques enables data-driven parameterizations, which have been investigated in many recent studies. Some of these investigations suggest that a priori–trained ML models exhibit satisfactory accuracy during training but perform poorly once coupled to dynamical cores and tested online. Here we use the evolution of the barotropic vorticity equation (BVE) with periodically reinforced shear instability as a prototype problem to develop and evaluate a model-consistent training strategy, which employs a numerical solver that supports automatic differentiation and includes the solver in the loss function used to train ML-based subgrid-scale (SGS) turbulence models. This approach enables the dynamical core and the ML-based parameterization to interact during the model training phase. The BVE model was run at low, high, and ultrahigh (truth) resolutions. Our training dataset contains only a short period of coarsened high-resolution simulations. Nevertheless, given initial conditions long after the training period, the trained SGS model still increases the effective lead time of the low-resolution BVE model by up to 50% relative to the BVE simulation without an SGS model. We also tested normalizing the loss function with a covariance matrix and found that it notably boosts the performance of the ML parameterization. The SGS model's performance is further improved by transfer learning with a limited number of discontinuous observations, which increases the forecast lead-time improvement to 73%. This study demonstrates a potential pathway to using machine learning to enhance the prediction skill of climate and weather models.

Significance Statement

Numerical weather prediction is performed at limited resolution for computational feasibility, and the schemes that estimate unresolved processes are called parameterizations. We propose a strategy for developing better deep learning–based parameterizations in which an automatically differentiable numerical solver serves as the dynamical core and interacts with the parameterization scheme during its training. Such a solver enables model-consistent deep learning, because the parameterization is trained with the direct objective of making the forecast of the full numerical model (dynamical core and parameterization together) match observations as closely as possible. We demonstrate the feasibility and effectiveness of this strategy using a surrogate model and advocate that such machine learning–enabled numerical models offer a promising pathway toward next-generation weather forecasting and climate models.

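To make the training strategy described above more concrete, the following is a minimal, hypothetical sketch of solver-in-the-loop training written in JAX. It is not the authors' code: a toy one-dimensional advection–diffusion step stands in for the BVE dynamical core, a small MLP stands in for the SGS model, and every name and constant (solver_step, sgs_model, rollout, loss_fn, the grid and time-step values) is an illustrative assumption. The sketch only shows the mechanism the abstract relies on: because the solver itself is differentiable, gradients of a covariance-normalized rollout loss can flow through both the dynamical core and the parameterization.

```python
# Minimal sketch (not the paper's code): "solver-in-the-loop" training in JAX.
# A toy 1D periodic advection-diffusion step stands in for the BVE dynamical core,
# and a tiny MLP stands in for the ML-based SGS model. All names are illustrative.
import jax
import jax.numpy as jnp

N, DX, DT, NU, C = 64, 1.0, 0.1, 0.05, 1.0   # grid size, spacing, time step, viscosity, advection speed

def solver_step(u):
    """One explicit step of the coarse 'dynamical core' (toy stand-in for the BVE solver)."""
    dudx  = (jnp.roll(u, -1) - jnp.roll(u, 1)) / (2.0 * DX)
    d2udx = (jnp.roll(u, -1) - 2.0 * u + jnp.roll(u, 1)) / DX**2
    return u + DT * (-C * dudx + NU * d2udx)

def init_params(key, hidden=32):
    k1, k2 = jax.random.split(key)
    return {"w1": 0.1 * jax.random.normal(k1, (N, hidden)),
            "b1": jnp.zeros(hidden),
            "w2": 0.1 * jax.random.normal(k2, (hidden, N)),
            "b2": jnp.zeros(N)}

def sgs_model(params, u):
    """ML-based SGS tendency added to the resolved tendency (illustrative MLP)."""
    h = jnp.tanh(u @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

def rollout(params, u0, n_steps):
    """Unroll solver plus SGS correction; the whole loop is differentiable."""
    def step(u, _):
        u_next = solver_step(u) + DT * sgs_model(params, u)
        return u_next, u_next
    _, traj = jax.lax.scan(step, u0, None, length=n_steps)
    return traj

def loss_fn(params, u0, ref_traj, cov_inv):
    """Covariance-normalized (Mahalanobis-style) mismatch over the rollout,
    loosely mirroring the idea of normalizing the loss with a covariance matrix."""
    traj = rollout(params, u0, ref_traj.shape[0])
    err = traj - ref_traj                                   # (n_steps, N)
    return jnp.mean(jnp.einsum("ti,ij,tj->t", err, cov_inv, err))

# Synthetic "coarsened high-resolution" reference trajectory, for illustration only.
x = jnp.linspace(0.0, 2.0 * jnp.pi, N, endpoint=False)
u0 = jnp.sin(x)
ref_traj = rollout(init_params(jax.random.PRNGKey(0)), u0, 20)   # placeholder reference
cov_inv = jnp.eye(N)                                             # identity if no covariance estimate

params = init_params(jax.random.PRNGKey(1))
grads = jax.grad(loss_fn)(params, u0, ref_traj, cov_inv)         # gradients flow through the solver
```

Because the solver step is itself JAX code, jax.grad propagates sensitivities through the entire rollout, so the SGS model is optimized for how it behaves when coupled to the dynamical core rather than for offline accuracy alone; replacing cov_inv with the inverse of an estimated error covariance would give a covariance-normalized loss in the spirit of the one discussed in the abstract.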