While many communications media, such as wireless and certain classes of wireline channels, typically produce bursty errors, most decoders are designed assuming memoryless channels. Consequently, communication systems generally rely on interleaving over tens of thousands of bits to match decoder assumptions. Even for short, high-rate codes, waiting to accumulate sufficient data for interleaving and de-interleaving is a significant source of unwanted latency. We construct an extension to the recently proposed Guessing Random Additive Noise Decoding (GRAND) algorithm, which we call GRAND-MO for GRAND Markov Order. By forgoing interleaving and instead exploiting the bursty nature of the noise, low-latency communication is possible with block error rates that outperform their interleaved counterparts by a substantial margin. We establish that certain well-known binary codes with structured codeword patterns are ill-suited for use in bursty channels, but Random Linear Codes (RLCs) prove robust to correlated noise. We further demonstrate that by operating directly on modulated symbols rather than de-mapped bits, GRAND-MO achieves additional performance and complexity gains by exploiting information that is lost in demodulation. As a result, GRAND-MO provides one potential solution for applications that require ultra-reliable low-latency communication.
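To make the underlying idea concrete, the following is a minimal sketch of the basic GRAND principle that GRAND-MO extends: noise-effect patterns are tested in decreasing order of likelihood, and the first pattern whose removal yields a codeword is accepted. The weight-ordered schedule below is the maximum-likelihood order for a memoryless binary symmetric channel; GRAND-MO differs only in that it orders guesses by their likelihood under a Markov (bursty) noise model. The function and parameter names are illustrative, not taken from the paper.

```python
import itertools
import numpy as np

def grand_decode(y, H, max_weight=3):
    """Sketch of basic GRAND for a binary linear code with parity-check matrix H.

    Noise patterns are guessed in order of increasing Hamming weight,
    the maximum-likelihood order for a memoryless BSC with crossover
    probability below 0.5. GRAND-MO would instead rank patterns by
    their probability under a two-state Markov burst-noise model.
    """
    n = len(y)
    # Weight-0 guess first (y itself), then all patterns of weight 1, 2, ...
    for w in range(max_weight + 1):
        for support in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(support)] = 1
            c = y ^ e  # remove the guessed noise effect
            if not np.any(H @ c % 2):  # zero syndrome: c is a codeword
                return c, e
    return None, None  # abandon after exhausting the guess budget
```

Because GRAND interrogates the code only through a codebook-membership check, the same loop works for any linear code, which is why the paper can pair it with Random Linear Codes; only the guessing order changes between the memoryless and Markov variants.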