Abstract

Rainfall measurements have been investigated worldwide because of their important implications for meteorology, hydrology, flood warnings and freshwater resource management. Recently, a new way of measuring rainfall has been proposed, based on the commercial microwave radio links that form cellular communication networks: the path-integrated rainfall intensity is estimated from the received signal level. This method can reveal the fine-scale evolution of rainfall in space and time and allows observation of near-surface rainfall at a spatial resolution of up to 1 × 1 km² and a temporal resolution of up to 1 min, with no additional installation or maintenance costs. Here we examine two different methodologies for calculating instantaneous rainfall from microwave links. The study region covers a 1600 km² area in central Israel, which includes up to 70 commercial microwave links and 7 rain gauges installed in the vicinity of the links. Nineteen rainstorm events over a two-year period, covering about 676 h in total, are evaluated. The first methodology uses data from the nearest microwave link, while the second uses data interpolated at the location of the rain gauge from multiple nearby links. Results from both methods are compared with measurements from nearby rain gauges. At a temporal resolution of 1 min, correlations of 0.65 and 0.77, with biases of −0.08 and −0.06 mm h⁻¹, were attained for the first and second methods, respectively. At a temporal resolution of 10 min, correlations of 0.84 and 0.85, with biases of −0.11 and −0.06 mm h⁻¹, were attained. On average, the interpolation-point methodology underestimated accumulated rainfall by only 3% relative to nearby rain gauges, while the single-link method overestimated it by 6%. Both methodologies improved (worsened) as the density of the microwave link grid increased (decreased).
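
To make the retrieval step concrete, below is a minimal, illustrative Python sketch (not the authors' implementation). It assumes the standard power-law relation A = a·R^b between specific attenuation A (dB km⁻¹) and rain rate R (mm h⁻¹), and uses simple inverse-distance weighting as a stand-in for the multi-link interpolation of the second methodology. All function names, coefficients and numerical values here are hypothetical.

```python
import numpy as np

def rain_rate_from_link(rsl_dB, baseline_dB, path_length_km, a, b):
    # Rain-induced attenuation over the whole path (dB); values where the
    # signal sits above the dry baseline are clipped to zero.
    attenuation_dB = max(baseline_dB - rsl_dB, 0.0)
    # Specific attenuation (dB/km), then invert the power law A = a * R**b.
    specific_att = attenuation_dB / path_length_km
    return (specific_att / a) ** (1.0 / b)

def rainfall_at_gauge(link_rates, link_distances_km, power=2.0):
    # Inverse-distance-weighted estimate at the gauge location
    # (an illustrative stand-in for the multi-link interpolation method).
    r = np.asarray(link_rates, dtype=float)
    d = np.maximum(np.asarray(link_distances_km, dtype=float), 1e-3)
    w = d ** (-power)
    return float(np.sum(w * r) / np.sum(w))

# Three hypothetical links near one gauge:
# (received signal level during rain, dry baseline, path length in km).
links = [(-48.0, -44.5, 3.2), (-51.0, -46.0, 5.0), (-47.5, -45.0, 2.1)]
rates = [rain_rate_from_link(rsl, base, L, a=0.24, b=1.06)
         for rsl, base, L in links]

print("nearest single link (mm/h):", round(rates[2], 2))        # first methodology
print("interpolated at gauge (mm/h):",
      round(rainfall_at_gauge(rates, [3.0, 4.2, 1.5]), 2))      # second methodology
```

In practice the coefficients a and b depend on the link frequency and polarization, and steps such as wet/dry classification of the baseline and wet-antenna correction are applied before the power-law inversion.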
