Abstract

At RF/microwave frequencies below 10 GHz, the atmosphere is considered transparent except during extreme rain events. Moreover, scattering by rain drops is generally negligible, so the calculation of extinction coefficients is dominated by liquid water absorption. In convective rain events (thunderstorms), however, larger rain drop sizes affect both the absorption and scattering coefficients used in rain extinction propagation models. Since 2010, the Central Florida Remote Sensing Laboratory has been conducting field measurements to empirically derive extinction coefficients in strong convective thunderstorms using a zenith-pointing radiometer operating at 6.8 GHz. Experimentally, blackbody noise emission from rain (brightness temperature, Tb) is measured simultaneously with the rain rate (mm/hr) from surface gauges and the surface rain drop size distribution from a disdrometer. This paper is a theoretical treatment of the signal processing procedures used to derive rain extinction coefficients. The rain gauges, disdrometer, and microwave radiometer have different sampling times, and aligning their measurements in the spatial/temporal domain with the dynamics of a thunderstorm is challenging. The paper presents several candidate signal analysis approaches based on a radiative transfer model (RTM) that incorporates both rain drop scattering and absorption. The rain gauge and disdrometer measurements serve as inputs to the RTM to calculate an instantaneous downwelling radiance (brightness temperature) time series. The radiometer's measured Tb time series is compared with the RTM's theoretical Tb time series while the rain extinction coefficients are iteratively estimated in a Maximum Likelihood Estimation procedure. A simulated rain event is illustrated and analyzed.
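To make the estimation procedure concrete, the following is a minimal sketch, not the authors' implementation: it models the downwelling Tb seen by a zenith-pointing radiometer with a simple absorption-only radiative transfer expression, then iteratively fits the rain extinction parameters by maximum likelihood against a simulated "measured" Tb time series, in the spirit of the simulated rain event described above. The power-law extinction form k_ext = a*R**b, the effective rain temperature T_EFF, the rain-column height H_KM, and the parameter values are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assumed forms, not the authors' RTM): absorption-only
# downwelling Tb model plus maximum-likelihood fit of extinction parameters.
import numpy as np
from scipy.optimize import minimize

T_COSMIC = 2.73   # K, cosmic background seen by a zenith-pointing radiometer
T_EFF = 275.0     # K, assumed effective physical temperature of the rain layer
H_KM = 4.0        # km, assumed effective rain-column height

def modeled_tb(rain_rate_mm_hr, a, b):
    """Downwelling Tb (K) for a zenith path through rain.

    Zenith opacity tau = k_ext * H with an assumed power-law extinction
    k_ext = a * R**b (Np/km); scattering is neglected here for brevity.
    """
    k_ext = a * np.power(np.maximum(rain_rate_mm_hr, 0.0), b)
    trans = np.exp(-k_ext * H_KM)
    return T_COSMIC * trans + T_EFF * (1.0 - trans)

def neg_log_likelihood(params, rain_rate, tb_meas, sigma=1.0):
    """Gaussian negative log-likelihood of the measured Tb given (a, b)."""
    a, b = params
    resid = tb_meas - modeled_tb(rain_rate, a, b)
    return 0.5 * np.sum((resid / sigma) ** 2)

# Simulated convective event: rain-rate burst and noisy "measured" Tb.
rng = np.random.default_rng(0)
t = np.arange(0, 600, 10)                        # 10 s samples over 10 min
rain = 60.0 * np.exp(-((t - 300) / 120.0) ** 2)  # convective burst, mm/hr
tb_true = modeled_tb(rain, a=0.01, b=1.1)        # assumed "true" parameters
tb_meas = tb_true + rng.normal(0.0, 1.0, t.size) # ~1 K radiometric noise

# Iterative MLE: minimize the negative log-likelihood over (a, b).
fit = minimize(neg_log_likelihood, x0=[0.005, 1.0],
               args=(rain, tb_meas), method="Nelder-Mead")
print("Estimated (a, b):", fit.x)
```

In the paper's actual procedure the RTM also includes rain drop scattering and takes the disdrometer drop size distribution as input; the sketch's single power-law channel simply illustrates how a modeled Tb series is compared against the measured one while the extinction parameters are iterated to a maximum-likelihood solution.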
