Abstract

Predicting the quality of service at a node in heterogeneous networks of line-of-sight terrestrial microwave links requires knowledge of the spatial and temporal statistics of rain over scales from a few meters to tens or hundreds of kilometers, and over temporal periods as short as 1 s. Meteorological radar databases provide rain rate maps over areas with a spatial resolution as fine as a few hundred meters and a sampling period of 2 to 15 min. Such two-dimensional rain rate map time series would have wide application in the simulation of rain scatter and attenuation on arbitrary millimeter-wave radio networks if the sampling period were considerably shorter, i.e., of the order of 10 s or less, and the integration volumes smaller. This paper investigates a stochastic-numerical method for interpolating and downscaling rain rate field time series to shorter sampling periods and smaller spatial integration areas while conserving the measured and expected statistics. A series of radar-derived rain maps with a 10 min sampling period is interpolated to 10 s. The statistics of the interpolated-downscaled data are compared to fine-scale rain data, i.e., 10 s rain gauge data and radar data with a 300 m resolution. The interpolated rain map time series is used to predict the fade duration statistics of a microwave link, and these predictions are compared to a published model and to the ITU-R model.
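The abstract does not detail the stochastic-numerical method itself, so the sketch below is only an illustration of the downscaling task it describes: given two consecutive radar rain rate maps sampled 10 min apart, generate intermediate frames at a 10 s spacing. It uses a naive log-domain linear interpolation with a lognormal multiplicative perturbation; the function name, the noise level `noise_sigma`, and the dry-pixel floor are all assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch only: NOT the paper's stochastic-numerical method.
# Interpolates two radar rain-rate maps (10 min apart) to a 10 s frame
# interval by linear interpolation in log-rain-rate space, with an assumed
# lognormal perturbation to reintroduce fine-scale variability.
import numpy as np

def interpolate_rain_maps(map_t0, map_t1, n_steps=60, noise_sigma=0.1, rng=None):
    """Return n_steps - 1 intermediate maps between map_t0 and map_t1.

    map_t0, map_t1 : 2-D arrays of rain rate (mm/h), same shape.
    n_steps        : sub-frames per interval (60 gives 10 s steps over 10 min).
    noise_sigma    : std. dev. of the lognormal perturbation (assumed value).
    """
    rng = np.random.default_rng() if rng is None else rng
    eps = 0.01  # floor (mm/h) so zero-rain pixels can be log-transformed
    log0 = np.log(np.maximum(map_t0, eps))
    log1 = np.log(np.maximum(map_t1, eps))

    frames = []
    for k in range(1, n_steps):
        w = k / n_steps
        log_interp = (1.0 - w) * log0 + w * log1           # linear in log space
        noise = rng.normal(0.0, noise_sigma, size=log0.shape)
        frame = np.exp(log_interp + noise)                 # lognormal perturbation
        frame[frame < eps * 1.5] = 0.0                     # restore dry pixels
        frames.append(frame)
    return frames

# Example: two synthetic 100 x 100 maps, 59 interpolated frames at 10 s spacing.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.gamma(shape=0.5, scale=4.0, size=(100, 100))
    b = rng.gamma(shape=0.5, scale=4.0, size=(100, 100))
    frames = interpolate_rain_maps(a, b, n_steps=60, rng=rng)
    print(len(frames), frames[0].shape)
```

A real downscaling scheme would also need spatially and temporally correlated noise so that the interpolated fields reproduce the measured rain statistics, which is precisely the conservation property the paper's method is designed to provide.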

