Abstract

This paper presents deep learning (DL) methods to optimize polar belief propagation (BP) decoding and concatenated LDPC-polar codes. First, two-dimensional offset Min-Sum (2-D OMS) decoding is proposed to improve the error-correction performance of existing normalized Min-Sum (NMS) decoding. Two optimization methods used in DL, namely back-propagation and stochastic gradient descent, are exploited to derive the parameters of the proposed algorithms. Numerical results demonstrate that there is no performance gap between 2-D OMS and exact BP across various code lengths. Then, low-complexity concatenated OMS algorithms are presented for concatenated LDPC-polar codes. As a result, the optimized concatenated OMS decoding yields error-correction performance comparable to that of the CRC-aided successive cancellation list (CA-SCL) decoder with list size 2 on length-1024 polar codes. In addition, efficient hardware architectures of a scalable polar OMS decoder are described; the proposed decoder is reconfigurable to support three code lengths ($N = 256, 512, 1024$) and two decoding algorithms (2-D OMS and concatenated OMS). The polar OMS decoder implemented in 65 nm CMOS technology achieves a maximum coded throughput of 5.4 Gb/s at $E_{b}/N_{0} = 4$ dB for code length 1024 and 7.5 Gb/s at $E_{b}/N_{0} = 3.5$ dB for code length 256, which is comparable to state-of-the-art polar BP decoders. Moreover, a throughput of 5.1 Gb/s at $E_{b}/N_{0} = 4$ dB is achieved in the concatenated OMS decoding mode for code length 1024 with a latency of 200 ns, which is superior to existing CA-SCL decoders with similar error-correction performance.
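To make the abstract's core idea concrete: in offset Min-Sum decoding, the exact check-node (boxplus) update on log-likelihood ratios, $f(a,b) = 2\tanh^{-1}\!\big(\tanh(a/2)\tanh(b/2)\big)$, is approximated by $\operatorname{sign}(a)\operatorname{sign}(b)\max\!\big(\min(|a|,|b|) - \beta,\, 0\big)$, and the offset $\beta$ is tuned with back-propagation and stochastic gradient descent. The sketch below is an illustrative simplification of that idea, not the paper's training setup: it fits a single scalar offset so the OMS update matches the exact boxplus on randomly drawn LLR samples, using a hand-derived gradient and plain SGD. The mean-squared-error objective, the single shared offset, and the LLR sample distribution are assumptions for illustration; the paper learns stage- and iteration-dependent (2-D) parameters over unfolded decoder iterations.

```python
import numpy as np

def boxplus(a, b):
    # Exact BP check-node update on LLRs (clipped to avoid arctanh overflow).
    return 2.0 * np.arctanh(np.clip(np.tanh(a / 2.0) * np.tanh(b / 2.0),
                                    -0.999999, 0.999999))

def oms_check(a, b, beta):
    # Offset Min-Sum approximation: min magnitude reduced by an offset beta.
    return np.sign(a) * np.sign(b) * np.maximum(np.minimum(np.abs(a), np.abs(b)) - beta, 0.0)

rng = np.random.default_rng(0)
a = rng.normal(0.0, 2.0, 10000)   # hypothetical LLR samples for illustration
b = rng.normal(0.0, 2.0, 10000)
target = boxplus(a, b)

beta, lr = 0.0, 0.05
for _ in range(200):
    pred = oms_check(a, b, beta)
    # Gradient of the mean-squared error w.r.t. beta,
    # using the subgradient of max(., 0): active where the offset still bites.
    active = (np.minimum(np.abs(a), np.abs(b)) - beta) > 0.0
    grad = np.mean(2.0 * (pred - target) * (-np.sign(a) * np.sign(b)) * active)
    beta -= lr * grad             # plain SGD step

print(f"learned offset beta = {beta:.3f}")
```

In the paper's 2-D formulation, one such offset would be learned per graph stage and per iteration rather than a single shared scalar, with the decoder iterations unfolded into a trainable network so the gradients flow through the full BP schedule.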
