Abstract

The estimation of rain attenuation over a satellite link requires accurate rainfall-rate data. The rapid growth of satellite networks using higher-frequency bands such as the Ka-band has highlighted the need to assess the combined effect of multiple propagation impairments. Satellite communication links operating at Ku-band and above experience rain fades due to signal absorption and scattering. For link budget planning, tropical and subtropical regions are of particular concern because of their high precipitation compared with temperate regions. This paper examines the performance of a time-series ARIMA model on a Ka-band terrestrial link in Durban, South Africa. The model's performance and validity are tested against received-signal-level measurements over a 6.73 km terrestrial line-of-sight (LOS) link centred at 19.5 GHz, the synthetic storm technique, and the International Telecommunication Union Radiocommunication (ITU-R) recommendation model, with rain attenuation generated from rain-rate data spanning nine years (2005-2013). The results reveal that the ITU-R model did not correspond with the measured results. Hence, we tested a supervised-learning, time-series-based attenuation prediction method, which provides better performance than the existing models. Furthermore, comparison with experimental results shows that the proposed method offers real-time forecasting and high availability. The findings of this study provide quantitative insight into time-series rain fade needed in planning for 5G networks and beyond in subtropical regions.
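
The workflow summarised above (converting a measured rain-rate series to rain attenuation with the ITU-R power-law relation, then fitting a time-series ARIMA model to forecast fades) could be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the power-law coefficients k and alpha, the ARIMA order, and the synthetic rain-rate series are placeholder assumptions; only the 6.73 km path length comes from the abstract.

```python
# Hedged sketch: ITU-R-style power-law attenuation from rain rate, followed by an
# ARIMA fit and short-horizon forecast. Coefficients and model order are placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic one-minute rain-rate samples (mm/h) standing in for measured data.
rng = np.random.default_rng(0)
rain_rate = pd.Series(rng.gamma(shape=0.6, scale=8.0, size=500))

# Power-law specific attenuation gamma = k * R**alpha (dB/km); k and alpha below are
# illustrative values, not the ITU-R P.838 coefficients for 19.5 GHz.
k, alpha = 0.08, 1.06
path_length_km = 6.73                      # LOS link length quoted in the abstract
attenuation_db = k * rain_rate**alpha * path_length_km

# Fit a simple ARIMA(p, d, q) model to the attenuation series and forecast ahead.
fitted = ARIMA(attenuation_db, order=(2, 1, 1)).fit()
print(fitted.forecast(steps=5))            # next few samples of predicted rain fade
```

In practice the coefficients would be taken from ITU-R Recommendation P.838 for the actual frequency and polarization, and an effective path length (rather than the full link length) would normally be used, following the relevant ITU-R path-attenuation procedure.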
