Abstract

In this paper, we propose a method to analyze the performance of ultra-reliable low-latency communication (URLLC) with packet-level coding when no feedback from the receiver is available. Unlike conventional methods that analyze URLLC performance using the average error rate, we focus on events of burst or clustered reception errors, as they can be fatal for certain real-time wireless control systems. In the proposed method, to see the impact of clustered errors on performance, a virtual queue is considered in which the reception error sequence is regarded as the arrival process and the departure process is characterized by the target error rate. This virtual queue makes it possible to find, using large deviations theory, how often clustered errors of a certain size occur. For packet-level channels modeled by an independent and identically distributed process and a two-state Markov chain, the quality-of-service exponents are derived and the asymptotic probability of buffer overflow is obtained, which agrees with simulation results. This demonstrates that the proposed method based on a virtual queue is useful for finding the probability of system failure due to clustered errors and allows one to determine the values of key URLLC parameters for a certain probability of system failure under given conditions.
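
To make the virtual-queue idea concrete, the following is a minimal Python sketch for the i.i.d. case only: reception errors are treated as Bernoulli arrivals with probability p, the virtual queue drains at the target error rate eps per slot, and the overflow probability beyond a threshold B is compared against the large-deviations estimate exp(-theta*B), where theta* is the QoS exponent. The symbols p, eps, B and both function names are illustrative assumptions, not notation taken from the paper.

```python
import numpy as np

# Sketch under assumed notation (not the paper's exact model): an i.i.d.
# Bernoulli(p) reception-error process feeds a virtual queue that drains at
# the target error rate eps per slot; queue levels above B are read as
# clustered-error events large enough to count as system failure.

def simulate_overflow_prob(p, eps, B, n_slots=1_000_000, seed=None):
    """Empirical fraction of slots in which the virtual queue exceeds B."""
    rng = np.random.default_rng(seed)
    errors = rng.random(n_slots) < p          # arrival process: 1 = reception error
    q, overflows = 0.0, 0
    for a in errors:
        q = max(q + a - eps, 0.0)             # Lindley recursion for the virtual queue
        overflows += q > B
    return overflows / n_slots

def qos_exponent(p, eps, tol=1e-10):
    """Positive root theta* of log(1 - p + p*e^theta) = theta*eps (i.i.d. case),
    so that P(Q > B) decays roughly like exp(-theta* * B) for large B."""
    f = lambda th: np.log1p(p * (np.exp(th) - 1.0)) - th * eps
    lo, hi = tol, 50.0                        # f(lo) < 0, f(hi) > 0 when p < eps
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    p, eps, B = 0.01, 0.05, 5                 # illustrative numbers only
    theta = qos_exponent(p, eps)
    print("theta* =", theta, " exp(-theta*B) =", np.exp(-theta * B))
    print("simulated P(Q > B) =", simulate_overflow_prob(p, eps, B, seed=0))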
