Abstract
Extensive studies have demonstrated the effectiveness of constructing capacity-approaching codes by block Markov superposition transmission (BMST). However, to achieve high performance, BMST codes typically require large encoding memories and large decoding window sizes, which result in high decoding complexity and long decoding latency. To address this issue, we introduce recursive BMST (rBMST), in which a block-oriented feedback convolutional code is used in place of the block-oriented feedforward convolutional code. We propose a modified extrinsic information transfer (EXIT) chart analysis to study the convergence behavior of rBMST codes. On one hand, rBMST codes share most merits of BMST codes, including near-capacity performance, low-complexity encoding, and flexible construction. On the other hand, compared with BMST codes, rBMST codes require a smaller encoding memory, and hence a lower decoding complexity, to approach capacity. In particular, analytical results show that the rBMST code ensemble with encoding memory three exhibits a lower error floor than the BMST code ensemble with encoding memory twelve.
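To make the feedforward/feedback distinction concrete, below is a minimal Python sketch of the two block-oriented encoding rules, assuming binary blocks over GF(2), a zero initial state, and one random interleaver per memory tap. The basic-code encoding step is abstracted away (the inputs are blocks already produced by the basic code), and all names (`bmst_encode`, `rbmst_encode`, `perms`) are illustrative, not taken from the paper.

```python
import numpy as np

def bmst_encode(v_blocks, perms):
    # Feedforward BMST with encoding memory m = len(perms):
    # c(t) = v(t) + P_1(v(t-1)) + ... + P_m(v(t-m)) over GF(2),
    # where v(t) is the t-th block produced by the basic code
    # and blocks before time 0 are taken to be all-zero.
    c_blocks = []
    for t, v in enumerate(v_blocks):
        c = v.copy()
        for i, p in enumerate(perms, start=1):
            if t - i >= 0:
                c ^= v_blocks[t - i][p]  # interleave a past *input* block, then superpose
        c_blocks.append(c)
    return c_blocks

def rbmst_encode(v_blocks, perms):
    # Recursive (feedback) variant: the superposed blocks are past
    # *transmitted* blocks, c(t) = v(t) + P_1(c(t-1)) + ... + P_m(c(t-m)),
    # mirroring a feedback convolutional encoder at the block level.
    c_blocks = []
    for t, v in enumerate(v_blocks):
        c = v.copy()
        for i, p in enumerate(perms, start=1):
            if t - i >= 0:
                c ^= c_blocks[t - i][p]  # interleave a past *output* block, then superpose
        c_blocks.append(c)
    return c_blocks

# Toy usage: 8 blocks of length 16, encoding memory m = 3.
rng = np.random.default_rng(0)
n, m = 16, 3
v_blocks = [rng.integers(0, 2, n, dtype=np.uint8) for _ in range(8)]
perms = [rng.permutation(n) for _ in range(m)]  # one interleaver per memory tap
c_ff = bmst_encode(v_blocks, perms)
c_fb = rbmst_encode(v_blocks, perms)
```

The only change between the two rules is which past blocks enter the superposition: feeding back the transmitted blocks makes the block-oriented encoder recursive, so each output depends on all earlier blocks even for a small memory m, which is consistent with the abstract's claim that rBMST approaches capacity with a smaller encoding memory than BMST.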