Abstract

Using optimal detection techniques with climate model simulations, most of the observed increase of near‐surface temperatures over the second half of the twentieth century is attributed to anthropogenic influences. However, the partitioning of the anthropogenic influence among individual factors, such as greenhouse gases and aerosols, is much less robust. Differences in how forcing factors are applied, in their radiative influence, and in models' climate sensitivities substantially influence the response patterns. We find that standard optimal detection methodologies cannot fully reconcile this response diversity. By selecting a set of experiments to enable the diagnosis of greenhouse gases and of the combined influence of other anthropogenic and natural factors, we find robust detections of well‐mixed greenhouse gases across a large ensemble of models. Of the observed warming over the twentieth century of 0.65 K/century we find, using a multimodel mean not incorporating pattern uncertainty, a well‐mixed greenhouse gas warming of 0.87 to 1.22 K/century. This is partially offset by cooling from other anthropogenic and natural influences of −0.54 to −0.22 K/century. Although better constrained than in recent studies, the ranges of attributable trends across climate models are still wide, with implications for observationally constrained estimates of transient climate response. Some of the uncertainties could be reduced in future by having more model data to better quantify the simulated estimates of the signals and natural variability, by designing model experiments more effectively, and by better quantifying climate model radiative influences. Most importantly, how model pattern uncertainties are incorporated into the optimal detection methodology should be improved.
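To make the attribution arithmetic concrete, the sketch below illustrates the core of a two-signal detection and attribution regression: observations are regressed onto model-simulated response patterns for well-mixed greenhouse gases and for the combined other anthropogenic and natural factors, and the resulting scaling factors convert model-simulated trends into attributable trends. All data and names here are illustrative; the paper's actual method uses optimal fingerprinting with prewhitening by a noise covariance estimated from control simulations and a total least squares regression, which this ordinary least squares stand-in does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # e.g. decadal-mean, area-averaged temperature anomalies (illustrative)

# Hypothetical model-simulated response patterns (the "signals"):
x_ghg = np.linspace(0.0, 1.2, n)    # well-mixed greenhouse-gas response
x_oan = np.linspace(0.0, -0.5, n)   # other anthropogenic + natural response

# Synthetic "observations": true scaling factors of 1.0 on each signal,
# plus noise standing in for internal climate variability.
y = 1.0 * x_ghg + 1.0 * x_oan + rng.normal(0.0, 0.1, n)

# Ordinary least squares for the scaling factors beta (a simplified
# stand-in for the optimal/total-least-squares regression used in
# attribution studies).
X = np.column_stack([x_ghg, x_oan])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# A signal is "detected" when its beta is inconsistent with zero; the
# attributable trend is beta times the model-simulated trend.
t = np.arange(n)
trend_ghg_model = np.polyfit(t, x_ghg, 1)[0] * n  # trend over the record
print("scaling factors beta =", beta)
print("attributable GHG trend =", beta[0] * trend_ghg_model)
```

In this framing, the paper's headline numbers correspond to the scaled model trends: a greenhouse-gas contribution of 0.87 to 1.22 K/century partially offset by −0.54 to −0.22 K/century from other factors, summing to roughly the observed 0.65 K/century.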
