Abstract

The information bottleneck method is a generic clustering framework from the field of machine learning that compresses an observed quantity while retaining as much as possible of the mutual information it shares with the quantity of primary relevance. The framework was recently used to design message-passing decoders for low-density parity-check codes in which all arithmetic operations on log-likelihood ratios are replaced by table lookups of unsigned integers. This paper presents, in detail, the application of the information bottleneck method to polar codes, where the framework is used to compress the virtual bit channels defined in the code structure, and shows that the benefits are twofold. On the one hand, the compression restricts the output alphabet of the bit channels to a manageable size. This facilitates computing the capacities of the bit channels in order to identify the ones with the largest capacities. On the other hand, the intermediate steps of the compression process can be used to replace the log-likelihood ratio computations in the decoder with table lookups of unsigned integers. Hence, a single procedure produces a polar encoder as well as its tailored, quantized decoder. Moreover, we use a technique called message alignment to reduce the space complexity of the quantized decoder obtained with the information bottleneck framework.
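The core idea of compressing a channel output alphabet while losing as little mutual information as possible can be illustrated with a small sketch. The greedy pairwise-merging scheme below is a simplified, agglomerative-style stand-in for the information bottleneck algorithm used in the paper, not the paper's exact construction; the function names and the toy joint distribution are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def mutual_information(p_xy):
    """I(X;Y) in bits for a joint distribution p(x, y) given as a 2-D array."""
    px = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])))

def greedy_ib_quantize(p_xy, n_clusters):
    """Greedily merge output symbols (columns of p(x, y)) pairwise,
    each time keeping the merge that preserves the most mutual
    information, until only n_clusters output symbols remain.
    A simplified stand-in for information-bottleneck quantization."""
    p = p_xy.copy()
    while p.shape[1] > n_clusters:
        best = None
        for i, j in combinations(range(p.shape[1]), 2):
            merged = np.delete(p, j, axis=1)
            merged[:, i] = p[:, i] + p[:, j]   # merge columns i and j
            mi = mutual_information(merged)
            if best is None or mi > best[0]:
                best = (mi, merged)
        p = best[1]
    return p

# toy example: uniform binary input, 4-level soft channel output
p_xy = np.array([[0.40, 0.05, 0.04, 0.01],
                 [0.01, 0.04, 0.05, 0.40]])
p_compressed = greedy_ib_quantize(p_xy, 2)   # compress to 2 output clusters
```

Because merging output symbols can only reduce `I(X;Y)` (data processing inequality), picking the merge with the highest remaining mutual information is exactly the "lose as little relevant information as possible" criterion described in the abstract.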

Highlights

  • In a typical transmission chain, forward error correction is a resource-hungry and computationally complex component

  • This paper presents the application of the information bottleneck method to the construction and coarsely quantized decoding of polar codes

  • The information bottleneck framework is used in the discrete density evolution of the virtual bit channels in order to restrict their output alphabet to a small finite size


Summary

Introduction

In a typical transmission chain, forward error correction is a resource-hungry and computationally complex component. Unlike other construction methods, the benefits of this approach are twofold. Firstly, the discrete density evolution using the information bottleneck method compresses the output alphabet of the bit channels of a polar code to a small size while minimizing the information loss caused by the compression. This compression facilitates computing the reliabilities of the bit channels in terms of their capacities, keeping track of the loss of mutual information caused by the quantization [25]. Secondly, the intermediate steps of the discrete density evolution provide deterministic mappings between the quantized input and output messages for each node in the Tanner graph of the polar code. These deterministic mappings can replace the LLR computations of a conventional decoder, constituting an information bottleneck decoder.
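To make the "deterministic mappings replace LLR computations" point concrete, the sketch below precomputes an integer lookup table for the check-node (f) update of a polar building block. The representative LLR values per cluster (`reps`) and the nearest-neighbor quantizer are hypothetical placeholders standing in for the quantities an information bottleneck design would produce; only the box-plus formula itself is the standard SC update.

```python
import numpy as np

def boxplus(l1, l2):
    """Check-node (f) update of a conventional SC decoder on LLRs."""
    return 2 * np.arctanh(np.tanh(l1 / 2) * np.tanh(l2 / 2))

# hypothetical: one representative LLR per cluster index,
# as would be delivered by an information bottleneck design
reps = np.array([-6.0, -2.0, -0.5, 0.5, 2.0, 6.0])

def quantize(llr):
    """Map an LLR back to the index of the nearest cluster representative."""
    return int(np.argmin(np.abs(reps - llr)))

# precompute the lookup table once, offline
K = len(reps)
f_table = np.zeros((K, K), dtype=np.uint8)
for a in range(K):
    for b in range(K):
        f_table[a, b] = quantize(boxplus(reps[a], reps[b]))

def f_node(a, b):
    """At run time the decoder only indexes unsigned integers --
    no transcendental functions, no floating point."""
    return f_table[a, b]
```

All the floating-point work happens offline when `f_table` is built; the decoder itself passes only small unsigned integers between nodes, which is precisely the implementation advantage the paper exploits.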

Polar Codes
Successive Cancellation List Decoder
Polar Code Construction
Information Bottleneck Method
Polar Code Construction Using the Information Bottleneck Method
Information Bottleneck Construction
Tal and Vardy’s Construction
Information Bottleneck Polar Decoders
Lookup Tables for Decoding on a Building Block
Information Bottleneck Successive Cancellation List Decoder
Space-Efficient Information Bottleneck Successive Cancellation List Decoder
The Role of Translation Tables
Message Alignment for Successive Cancellation List Decoder
Code Construction
Information Bottleneck Decoders
Conclusions
