Abstract

Minimum duration outages have been used to characterize time-dependent performance through quantities such as the average outage duration, the probability of outage, and the frequency of outage, in lognormal shadow fading (Mandayam et al., 1996) and in Rayleigh fading (Lai and Mandayam, 1997), respectively. In this paper, we compare and contrast the effect of minimum duration outages on fade margin selection in channels subject to lognormal shadow fading and Rayleigh fading. A comparative analysis of the relevant minimum durations reveals the widely different time-scales at work in outage considerations for each type of fading: lognormal shadow fading affects outage on a time-scale much larger than that of Rayleigh fading. Consequently, the time-scales dictated by the application govern the relative importance of each type of fading.
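As background for the fade margin discussion, the sketch below illustrates the conventional (zero-duration) outage-probability criterion for the two fading models the abstract compares; it is not the minimum-duration criterion developed in the paper. It assumes the standard results that received power is exponentially distributed under Rayleigh fading and Gaussian in dB under lognormal shadowing, with an assumed dB-spread of 8 dB.

```python
# Illustrative sketch (not from the paper): classical outage probability
# versus fade margin M (dB above the receiver threshold).
#   Rayleigh fading:      P_out = 1 - exp(-10^(-M/10))
#   Lognormal shadowing:  P_out = Q(M / sigma), sigma = dB spread
import math
from scipy.stats import norm


def rayleigh_outage(margin_db: float) -> float:
    """Outage probability under Rayleigh fading for a fade margin in dB."""
    return 1.0 - math.exp(-10.0 ** (-margin_db / 10.0))


def lognormal_outage(margin_db: float, sigma_db: float = 8.0) -> float:
    """Outage probability under lognormal shadowing (assumed sigma_db spread)."""
    return norm.sf(margin_db / sigma_db)  # Gaussian Q-function


for m in (5, 10, 15, 20):
    print(f"margin {m:2d} dB: Rayleigh {rayleigh_outage(m):.4f}, "
          f"lognormal (sigma=8 dB) {lognormal_outage(m):.4f}")
```

The table this prints shows why the two models are usually treated on separate footings when choosing a fade margin; the paper's contribution is to add the minimum-duration dimension, which introduces the very different time-scales noted above.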
