Abstract

The nature of the water vapour continuum has been of great scientific interest for more than 60 years. Here, water vapour self-continuum absorption spectra are retrieved at temperatures of 398 K and 431 K and at vapour pressures from 1000 to 4155 mbar in the 8800 and 10,600 cm−1 absorption bands using high-resolution FTS measurements. For the observed conditions, the MT_CKD-3.2 model underestimates the observed continuum by a factor of 1.5–2 on average. We use the hypothesis that water dimers contribute to the continuum absorption to simulate the experimentally retrieved self-continuum absorption spectra, and to explain their characteristic temperature dependence and spectral behaviour. Values of the effective equilibrium constant are derived for the observed temperatures. We find that the dimer-based model fits the measured self-continuum from this and previous studies well, but requires a higher effective equilibrium constant than modern estimates within the temperature range (268–431 K) and spectral region studied. It is shown that water dimers are likely responsible for up to 50% of the observed continuum within these bands. Possible causes of the incomplete explanation of the continuum are discussed. Extrapolating these measurements to atmospheric temperatures using the dimer-based model, we find that the newly derived self-continuum reduces calculated surface irradiances by 0.016 W m−2 more than the MT_CKD-3.2 self-continuum in the 8800 cm−1 band for overhead-Sun mid-latitude summer conditions, corresponding to a 12.5% enhancement of the self-continuum radiative effect. The change integrated across the 10,600 cm−1 band is about 1%, but with significant differences spectrally.
