Abstract

Satellite telecommunication systems that make use of frequencies higher than 10 GHz can experience strong attenuation due to rain. One of the countermeasures that can be adopted is time diversity. In this paper, the performance of the time diversity technique is investigated through both radar simulation and modeling. We first exploit an extensive database of radar maps of precipitation collected in northern Italy to generate synthetic time series of attenuation; the performance of the time diversity technique is evaluated for different frequencies and elevation angles. The same analysis is then performed using the ExCell attenuation prediction model, whose input parameters are the elevation, frequency and polarization of the Earth terminal, and the “effective” rainfall rate. The model‐predicted diversity gain is found to agree well with that obtained through radar simulation, demonstrating that the performance of the time diversity technique can be evaluated from rain rate time series acquired by a rain gauge.
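To illustrate the kind of evaluation described above, the sketch below shows one common way to estimate time diversity gain from an attenuation time series: the receiver keeps the better of the original and the delayed transmission, so the effective attenuation is the minimum of A(t) and A(t + T), and the gain at a given exceedance probability is the reduction in exceeded attenuation. This is a minimal, hypothetical Python example with synthetic data; the function names, sampling interval, and gamma-distributed series are assumptions and do not reproduce the paper's radar-based or ExCell-based computations.

```python
import numpy as np

def exceeded_attenuation(a, p):
    """Attenuation value (dB) exceeded for p percent of the samples."""
    return np.percentile(a, 100.0 - p)

def time_diversity_gain(a, delay_samples, p):
    """Time diversity gain (dB) at exceedance probability p (%) for a
    retransmission delay expressed in samples.

    With time diversity the effective attenuation is the minimum of the
    original and delayed samples, min(A(t), A(t + T)).
    """
    a = np.asarray(a, dtype=float)
    a_diversity = np.minimum(a[:-delay_samples], a[delay_samples:])
    return exceeded_attenuation(a, p) - exceeded_attenuation(a_diversity, p)

# Hypothetical attenuation series, one sample per minute for one year.
rng = np.random.default_rng(0)
attenuation_db = rng.gamma(shape=0.3, scale=8.0, size=60 * 24 * 365)

for delay_min in (5, 15, 30, 60):
    g = time_diversity_gain(attenuation_db, delay_min, p=0.01)
    print(f"delay {delay_min:3d} min -> gain {g:.2f} dB at 0.01% of time")
```

In practice the same computation would be applied to attenuation series derived from radar maps or converted from rain gauge measurements, and repeated over a range of delays and exceedance probabilities to build diversity gain curves.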
