Abstract

We propose a model of the signal fading process due to scintillation in the presence of rain. We analyzed a data set of uplink (30 GHz) and downlink (20 GHz) attenuation values averaged over 1 s intervals. The data comprise samples from ten significant events, totaling 180 000 s, recorded at the Spino d'Adda (northern Italy) station using the Olympus satellite. Our analysis starts from the observation that the plot of attenuation versus time resembles a realization of a self-similar process. On this basis we propose a fractional Brownian motion model for the scintillation process. We describe the model in detail, with figures illustrating the apparent self-similarity of the measured data, and show that the Hurst parameter of the process is a simple function of the rain fade. We also describe a method for generating random data that interpolate the measured samples while preserving some of their key statistical properties; this method can be used for simulating fade countermeasure systems. As a further application of the model, we show how to optimize fade measurement times for fade countermeasure systems.
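
The abstract does not say how the fractional Brownian motion paths are synthesized. As an illustration only, here is a minimal Python sketch using exact Cholesky synthesis of fractional Gaussian noise, a standard technique that is not necessarily the one used in the paper; the function name fbm_cholesky and the Hurst value are placeholders (the paper ties the Hurst parameter to the rain fade).

```python
import numpy as np

def fbm_cholesky(n, hurst, rng=None):
    """Illustrative fBm path with n increments (hypothetical helper).

    Exact Cholesky synthesis of fractional Gaussian noise, whose
    autocovariance is gamma(k) = 0.5*(|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}).
    Cost is O(n^3), so this suits short demo paths, not 180 000 s records.
    """
    rng = np.random.default_rng() if rng is None else rng
    k = np.arange(n)
    gamma = 0.5 * ((k + 1.0) ** (2 * hurst)
                   - 2.0 * k ** (2 * hurst)
                   + np.abs(k - 1.0) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]  # Toeplitz covariance of fGn
    noise = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    return np.concatenate(([0.0], np.cumsum(noise)))  # fBm = cumulative fGn

# Example: a short trace at 1 s sampling; H = 0.8 is an arbitrary choice.
path = fbm_cholesky(n=512, hurst=0.8, rng=np.random.default_rng(0))
```

For records as long as those analyzed in the paper, an O(n log n) generator such as Davies-Harte circulant embedding would be preferable to the O(n^3) Cholesky factorization shown here.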
