Extensive studies have demonstrated the effectiveness and flexibility of constructing capacity-approaching codes by block Markov superposition transmission (BMST). However, to achieve high performance, BMST codes typically require large encoding memories and large decoding window sizes, resulting in high decoding complexity and high decoding latency. To address these issues, we introduce recursive BMST (rBMST), in which a block-oriented feedback convolutional code replaces the block-oriented feedforward convolutional code of BMST. We propose a modified extrinsic information transfer (EXIT) chart analysis, which relates the mutual information to the bit error rate, to study the convergence behavior of rBMST codes. On one hand, the rBMST code shares most merits of the BMST code, including near-capacity performance, low-complexity encoding, and flexible construction. On the other hand, compared with the BMST code, the rBMST code requires a smaller encoding memory, and hence a lower decoding complexity, to approach capacity. In particular, both analytical and simulation results show that an rBMST code with encoding memory three exhibits a lower error floor than a BMST code with encoding memory twelve. Furthermore, we show by analysis and simulation that rBMST with fixed encoding memory ($m = 3$) and fixed decoding delay ($d = 12$) can be used to construct capacity-approaching multiple-rate codes. Finally, we compare rBMST codes with spatially coupled low-density parity-check (SC-LDPC) codes, showing the advantages of rBMST codes in terms of performance and decoding complexity.
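The abstract does not spell out the encoder structures, but the feedforward/feedback distinction can be illustrated with a minimal sketch. It assumes the standard BMST superposition rule, where the transmitted block at time $t$ is $c_t = v_t + \sum_{i=1}^{m} \Pi_i(v_{t-i})$ over GF(2) (with $v_t$ the basic-code codeword and $\Pi_i$ block interleavers), and assumes that the recursive (feedback) variant superimposes previously *transmitted* blocks $c_{t-i}$ instead of $v_{t-i}$; the function names and interfaces below are illustrative, not taken from the paper.

```python
import numpy as np

def bmst_encode(blocks, m, interleavers):
    """Feedforward BMST superposition over GF(2):
    c_t = v_t XOR Pi_1(v_{t-1}) XOR ... XOR Pi_m(v_{t-m}).

    blocks       : list of equal-length uint8 0/1 arrays (basic-code codewords v_t)
    interleavers : list of m permutation index arrays Pi_1..Pi_m
    """
    out = []
    for t, v in enumerate(blocks):
        c = v.copy()
        for i in range(1, m + 1):
            if t - i >= 0:
                # superimpose an interleaved *input* codeword from i steps back
                c ^= blocks[t - i][interleavers[i - 1]]
        out.append(c)
    return out

def rbmst_encode(blocks, m, interleavers):
    """Assumed recursive (feedback) variant:
    c_t = v_t XOR Pi_1(c_{t-1}) XOR ... XOR Pi_m(c_{t-m}),
    i.e. the memory holds previously *transmitted* blocks, giving the encoder
    an infinite impulse response even for small m."""
    out = []
    for t, v in enumerate(blocks):
        c = v.copy()
        for i in range(1, m + 1):
            if t - i >= 0:
                # superimpose an interleaved *output* (transmitted) block
                c ^= out[t - i][interleavers[i - 1]]
        out.append(c)
    return out
```

With identical inputs and interleavers, the two encoders coincide for the first two blocks (the memory contents are still equal there) and diverge afterward, which is where the feedback structure takes effect.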