Abstract

Ocean model simulations show variability arising from intrinsic chaos and from external forcing (air-sea fluxes, river input, etc.), and estimating their respective contributions to the total variability is important for attribution. Variance-based estimates of variability can be unreliable when the underlying distributions have non-Gaussian higher statistical moments. We demonstrate the use of non-parametric information theory metrics, Shannon entropy and mutual information, for measuring internal and forced variability in ocean models. These metrics are applied to spatially and temporally averaged data, and they delineate the ratio of intrinsic to total variability in a wider range of circumstances than previous approaches based on variance ratios. The metrics are applied to (a) a synthetic ensemble of random vectors, (b) the ocean component of a global climate (GFDL-ESM2M) large ensemble, and (c) an ensemble of a realistic coastal ocean model. The information theory metric qualitatively agrees with the variance-based metric and can additionally identify regions of nonlinear correlations. In application (b), the climate ensemble, the information theory metric detects higher intrinsic variability of temperature in the Arctic region than the variance metric does, illustrating that the former is robust to a skewed probability distribution (Arctic sea surface temperature) resulting from sharply nonlinear behavior near the freezing point. In application (c), the coastal ensemble, variability is dominated by external forcing. Using different selective forcing ensembles, we quantify the sensitivity of the coastal model to different types of external forcing: variations in river runoff and changes in the wind product do not add information (i.e., variability) during summer. Information theory thus enables ranking how much each forcing type contributes across multiple variables.
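As a minimal illustrative sketch (not the article's specific metric or code), Shannon entropy and mutual information can be estimated non-parametrically from histogram counts. The toy ensemble below, the variable names (`sst`, `forcing`), the noise amplitude, and the bin count are all hypothetical assumptions for demonstration: members share a common forcing signal but differ in their noise realizations, standing in for intrinsic variability.

```python
import numpy as np

def shannon_entropy(samples, bins=16):
    """Histogram (plug-in) estimate of Shannon entropy H(X) in bits."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # drop empty bins (0 log 0 := 0)
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X; Y) = H(X) + H(Y) - H(X, Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)  # marginals of x and y
    nz = p > 0                              # avoid log of zero
    return np.sum(p[nz] * np.log2(p[nz] / np.outer(px, py)[nz]))

# Hypothetical ensemble: members differ only in their noise realization,
# so spread across members is "intrinsic", while co-variation with the
# shared forcing signal is the "forced" part of the variability.
rng = np.random.default_rng(0)
n_members, n_time = 30, 500
forcing = np.sin(np.linspace(0, 8 * np.pi, n_time))              # shared external forcing
sst = forcing + 0.5 * rng.standard_normal((n_members, n_time))   # forced signal + intrinsic noise

h_total = shannon_entropy(sst.ravel())
mi_forced = mutual_information(np.broadcast_to(forcing, sst.shape).ravel(),
                               sst.ravel())
print(f"total entropy H  ~ {h_total:.2f} bits")
print(f"forced info   I  ~ {mi_forced:.2f} bits (forced fraction ~ {mi_forced / h_total:.2f})")
```

In this toy setup, the ratio I/H plays a role loosely analogous to a forced-to-total variance ratio; unlike variance, the histogram estimators make no Gaussianity assumption, which is the property the abstract highlights for skewed distributions such as near-freezing Arctic sea surface temperature. Bin count and sample size control the bias of plug-in entropy estimates, so any quantitative use would require a sensitivity check on the binning.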