Abstract
In this paper, we study the queue-overflow probability of wireless scheduling algorithms. In wireless networks operated under queue-length-based scheduling algorithms, there often exists a tight coupling between the service-rate process, the system backlog process, the arrival process, and the stochastic process governing channel variations. Although one can use sample-path large-deviation techniques to estimate the queue-overflow probability, the formulation leads to a difficult multidimensional calculus-of-variations problem. We present a new technique to address this difficulty. Using ideas from the Lyapunov function approach in control theory, the technique maps the complex multidimensional calculus-of-variations problem to a one-dimensional calculus-of-variations problem, which is often much easier to solve. Further, under appropriate conditions, we show that when a scheduling algorithm minimizes the drift of a Lyapunov function at each point of every fluid sample path, the algorithm is optimal in the sense that it maximizes the asymptotic decay rate of the probability that the Lyapunov function value exceeds a given threshold. We believe that these results can potentially be used to study the queue-overflow probability of a large class of wireless scheduling algorithms and to design new scheduling algorithms with optimal overflow probabilities.
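To make the optimality criterion concrete, the following is a brief sketch in standard large-deviations notation; the symbols $q(t)$, $V(\cdot)$, and $B$ are introduced here for illustration and are not taken from the abstract. Let $q(t)$ denote the vector of queue lengths, $V(\cdot)$ the Lyapunov function, and $B$ the overflow threshold. The asymptotic decay rate that the scheduling algorithm is said to maximize can be written as

\[
  I \;=\; \lim_{B \to \infty} \, -\frac{1}{B} \log \mathbb{P}\bigl( V(q) \ge B \bigr),
\]

where, under a sample-path large-deviation principle, $I$ is characterized as the minimum cost over all fluid sample paths that drive $V(q)$ from the empty state up to the threshold $B$. The Lyapunov-based mapping described above reduces this multidimensional minimization to a calculus-of-variations problem over the scalar trajectory $V(q(t))$ alone.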