Abstract

Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, such issues arise, for example, as a lack of information in the historical climate record compared with an ensemble prediction, or as a lack of information in a particular Gaussian ensemble prediction strategy, based on the first and second moments, compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, and a signal/dispersion decomposition. An explicit, computationally feasible family of estimators is developed here for estimating the relative entropy over a high-dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here to ensemble predictions for two recently developed "toy" climate models: the Galerkin truncation of the Burgers–Hopf equation and the Lorenz '96 model.
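
As a rough illustration of the quantities named above, the sketch below evaluates the relative entropy between two Gaussian densities together with its signal/dispersion split: the signal term measures information carried by the shift of the mean, the dispersion term measures information carried by the change in covariance. This closed form is standard for the Gaussian case; the function name, the NumPy implementation, and the 2-D numbers are illustrative assumptions, not the paper's estimators (which handle non-Gaussian ensembles through convex optimization and a hierarchical strategy).

```python
import numpy as np

def gaussian_relative_entropy(mu_p, cov_p, mu_q, cov_q):
    """Relative entropy of N(mu_p, cov_p) with respect to N(mu_q, cov_q),
    returned as (total, signal, dispersion). Illustrative sketch only."""
    n = mu_p.shape[0]
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_p - mu_q
    # Signal: information in the mean shift, weighted by the
    # reference (climatological) covariance.
    signal = 0.5 * diff @ cov_q_inv @ diff
    # Dispersion: information in the change of spread/shape.
    ratio = cov_p @ cov_q_inv
    _, logdet = np.linalg.slogdet(ratio)
    dispersion = 0.5 * (np.trace(ratio) - logdet - n)
    return signal + dispersion, signal, dispersion

# Hypothetical 2-D "prediction" vs. "climatology", for illustration only.
mu_clim, cov_clim = np.zeros(2), np.eye(2)
mu_pred = np.array([0.5, -0.2])
cov_pred = np.array([[0.6, 0.1],
                     [0.1, 0.8]])
total, sig, disp = gaussian_relative_entropy(mu_pred, cov_pred, mu_clim, cov_clim)
print(f"relative entropy = {total:.4f} (signal {sig:.4f} + dispersion {disp:.4f})")
```

In this convention the climatological record plays the role of the reference density, so a larger value indicates more additional information (predictability) in the ensemble prediction beyond climatology.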
