Abstract

Changes in the circumstances behind in situ temperature measurements often lead to biases in individual station records that, collectively, can also bias regional temperature trends. Because these biases are comparable in magnitude to climate change signals, homogeneity “corrections” are necessary to make the records suitable for climate analysis. To quantify the effectiveness of U.S. surface temperature homogenization, a randomized, perturbed-parameter ensemble of the USHCN pairwise homogenization algorithm was run against a suite of benchmark analogs to real monthly temperature data. Results indicate that all randomized versions of the algorithm consistently produce homogenized data closer to the true climate signal in the presence of widespread systematic errors. When applied to the real-world observations, the randomized ensemble reinforces previous understanding that the two dominant sources of bias in the U.S. temperature records are changes in the time of observation (spurious cooling in minimum and maximum temperatures) and the conversion to electronic resistance thermometers (spurious cooling in maximum and warming in minimum temperatures). Error bounds defined by the ensemble output indicate that maximum temperature trends are positive for the past 30, 50, and 100 years, and that the maximum temperature records contain pervasive negative biases that cause the unhomogenized (raw) trends to fall below the lower limits of uncertainty. Moreover, because residual bias in the homogenized analogs is one-tailed when the underlying errors are systematically biased, it is likely that maximum temperature trends have been underestimated in the USHCN. Trends for minimum temperature are also positive over all three periods, but the ensemble error bounds encompass the trends from the unhomogenized data.
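
The ensemble logic summarized above can be illustrated with a short sketch: generate many homogenized realizations (each from a different randomized choice of algorithm parameters), fit a trend to each, and take percentile bounds of the resulting trend distribution; a raw-series trend falling below the lower bound is consistent with a pervasive cool bias. The code below is an illustrative sketch only, using synthetic numbers and hypothetical helper names (decadal_trend, ensemble_trend_bounds); it is not the USHCN pairwise homogenization algorithm or the authors' benchmarking code.

```python
import numpy as np

def decadal_trend(monthly_anomalies):
    """Ordinary least squares trend of a monthly series, in degrees C per decade."""
    t = np.arange(len(monthly_anomalies)) / 120.0  # time axis in decades (120 months per decade)
    slope, _intercept = np.polyfit(t, monthly_anomalies, 1)
    return slope

def ensemble_trend_bounds(ensemble_series, lower=2.5, upper=97.5):
    """Percentile bounds on the trend across a perturbed-parameter ensemble.

    ensemble_series: array of shape (n_members, n_months); each row is one
    homogenized realization produced with a different randomized choice of
    adjustment-algorithm parameters.
    """
    trends = np.array([decadal_trend(member) for member in ensemble_series])
    return np.percentile(trends, [lower, upper])

# Purely synthetic illustration (not real USHCN output): a 100-member ensemble
# scattered around an assumed warming signal, plus a "raw" series with a cool bias.
rng = np.random.default_rng(0)
n_members, n_months = 100, 1200                          # 100 years of monthly data
true_signal = 0.0006 * np.arange(n_months)               # ~0.07 C/decade, for illustration only
ensemble = true_signal + rng.normal(0.0, 0.3, size=(n_members, n_months))
raw_series = true_signal - 0.0003 * np.arange(n_months)  # add a spurious cooling drift

low, high = ensemble_trend_bounds(ensemble)
raw_trend = decadal_trend(raw_series)
print(f"ensemble trend bounds: [{low:.3f}, {high:.3f}] C/decade; raw trend: {raw_trend:.3f}")
if raw_trend < low:
    print("raw trend falls below the ensemble lower bound, consistent with a pervasive cool bias")
```

In practice the spread across members reflects structural and parametric choices in the homogenization, not independent noise as in this toy example, but the bookkeeping of trend bounds versus the raw trend is the same.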
