Abstract

Brightness temperature spectra measured by the Special Sensor Microwave/Imager (SSM/I) flown onboard the F8 and F14 satellites of the U.S. Defense Meteorological Satellite Program (DMSP) during the 1987–1988 and 1997–1998 winter periods are analyzed concurrently with the data from snow monitoring stations over the former Soviet Union. Extensive analysis reveals the existence of two anomalies in the microwave thermal radiation spectra of snow cover. It is shown that at the beginning of winter the SSM/I measurements at 19, 37, and 85 GHz generally follow a classical pattern; that is, the brightness temperatures decrease for both increasing snow depth and increasing frequency. Dramatic departures from this behavior are observed around the middle of winter: The brightness temperatures reach a minimum and then begin to increase despite the fact that the snow depth remains constant or even continues to grow. Statistical analysis of the snowpack characteristics and SSM/I measurements is presented around the time when the brightness temperatures reach a minimum. The anomalous spectral characteristics are analyzed using a two‐stream radiative transfer model and dense media theory. It is shown how metamorphic changes in the snow crystalline structure are responsible for the brightness temperature minimum. The second departure from the normal snow signature is the inversion of brightness temperature spectra; that is, the higher‐frequency brightness temperatures are greater than the lower‐frequency measurements. It is shown that this phenomenon, observed previously over Greenland and Antarctica, is much more extensive. Radiative transfer simulations were used to show that a dense layer of surface crust on top of old coarse‐grained snow can produce the inverted brightness temperature spectrum.

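The qualitative link between grain growth and falling brightness temperatures can be illustrated with a minimal two-stream sketch. The code below is a toy model, not the paper's analysis: it assumes a single isothermal snow layer over frozen ground, an independent-scatterer Rayleigh-type scaling of the volume scattering coefficient with frequency and grain size (arbitrary constants), no scattering asymmetry, and no air–snow interface reflection. All parameter values (temperatures, grain sizes, ground reflectivity) are illustrative assumptions.

```python
# Toy two-stream sketch: brightness temperature of one isothermal snow layer
# over frozen ground. All constants below are illustrative assumptions, not
# values from the paper (which uses dense-media theory for the scattering
# properties of snow).

import numpy as np

def two_stream_tb(freq_ghz, depth_m, grain_mm,
                  t_snow=260.0, t_ground=270.0, r_ground=0.1, t_sky=10.0):
    """Upwelling brightness temperature (K) at the top of the snow layer."""
    # Independent-scatterer, Rayleigh-like scaling of the volume scattering
    # coefficient (1/m) with frequency and grain size; constant is arbitrary.
    k_sca = 1.0e-6 * freq_ghz**4 * grain_mm**3
    k_abs = 0.10 + 0.002 * freq_ghz        # crude absorption coefficient (1/m)
    k_ext = k_sca + k_abs
    omega = k_sca / k_ext                  # single-scattering albedo
    tau0 = k_ext * depth_m                 # optical depth of the layer

    # Two-stream solution for an isothermal layer (no asymmetry):
    #   u(tau) = T + A e^{+kappa tau} + B e^{-kappa tau}
    #   d(tau) = T + (A / r_inf) e^{+kappa tau} + B r_inf e^{-kappa tau}
    a, b = 1.0 - omega / 2.0, omega / 2.0
    kappa = np.sqrt(1.0 - omega)
    r_inf = (a - kappa) / b                # reflectivity of a semi-infinite layer
    e_top, e_bot = np.exp(kappa * tau0), np.exp(-kappa * tau0)

    # Boundary conditions: cold sky incident from above, d(tau0) = t_sky;
    # ground emission plus reflection from below,
    # u(0) = (1 - r_ground) * t_ground + r_ground * d(0).
    M = np.array([[e_top / r_inf,           r_inf * e_bot],
                  [1.0 - r_ground / r_inf,  1.0 - r_ground * r_inf]])
    rhs = np.array([t_sky - t_snow,
                    (1.0 - r_ground) * (t_ground - t_snow)])
    A, B = np.linalg.solve(M, rhs)
    return t_snow + A * e_top + B * e_bot  # upwelling TB at the snow surface

# Early-winter fine grains vs. midwinter coarse (depth-hoar) grains -- assumed sizes.
for grain_mm in (0.3, 1.0):
    tbs = [two_stream_tb(f, depth_m=0.3, grain_mm=grain_mm) for f in (19, 37, 85)]
    print(f"grain {grain_mm} mm:",
          " ".join(f"TB({f} GHz) = {tb:5.1f} K" for f, tb in zip((19, 37, 85), tbs)))
```

Run with a fine-grained (early-winter) and a coarser (midwinter depth-hoar-like) grain size, the sketch reproduces the classical ordering of brightness temperature falling with both frequency and grain size. The independent-Rayleigh scaling greatly overstates scattering for coarse grains at 85 GHz, which is one reason the paper's analysis relies on dense-media theory rather than a simple scaling of this kind.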