Abstract
In this paper, we propose an innovative channel coding scheme called accumulate-repeat-accumulate (ARA) codes. This class of codes can be viewed as serial turbo-like codes or as a subclass of low-density parity-check (LDPC) codes, and they have a projected graph or protograph representation; this allows high-speed iterative decoding implementations based on belief propagation. An ARA code can be viewed as a precoded repeat-accumulate (RA) code with puncturing, or as a precoded irregular repeat-accumulate (IRA) code where an accumulator is simply chosen as the precoder. The performance improvement due to the precoder is called the precoding gain. Using density evolution on their associated protographs, we find rate-1/2 ARA codes with a maximum variable node degree of 5 whose minimum bit SNR threshold is as low as 0.08 dB from the channel capacity threshold as the block size goes to infinity. Such a low threshold cannot be achieved by RA, IRA, or unstructured irregular LDPC codes under the same constraint on the maximum variable node degree. Furthermore, by puncturing the inner accumulator, we construct families of higher-rate ARA codes whose thresholds stay uniformly close to their respective channel capacity thresholds. Iterative decoding simulation results are provided and compared with turbo codes. In addition to the iterative decoding analysis, we analyze the performance of ARA codes under maximum-likelihood (ML) decoding. By obtaining the weight distribution of these codes and applying the tightest existing bounds, we show that the ML SNR threshold of ARA codes also closely approaches that of random codes. These codes have better interleaving gain than turbo codes.
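To make the encoder pipeline described above concrete, the following is a minimal Python sketch of a systematic ARA encoder over GF(2): an outer accumulator as the precoder, followed by repetition, interleaving, an inner accumulator, and puncturing of the inner accumulator output to raise the rate. The function name `ara_encode`, the repetition factor `q`, the puncturing pattern, and the random interleaver are all illustrative assumptions; the paper's actual codes are defined by optimized protographs, not by this generic pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def accumulate(bits):
    """Accumulator: running mod-2 sum, y_i = x_1 + x_2 + ... + x_i over GF(2)."""
    return np.cumsum(bits) % 2

def ara_encode(info_bits, q=3, puncture_period=3, perm=None):
    """Sketch of a systematic ARA encoder (assumed structure, not the
    paper's exact protograph).

    Pipeline: outer accumulator (precoder) -> repeat-by-q -> interleave
    -> inner accumulator -> puncture parity -> append to systematic bits.
    """
    precoded = accumulate(info_bits)           # precoder: outer accumulator
    repeated = np.repeat(precoded, q)          # repetition, rate 1/q
    if perm is None:
        perm = rng.permutation(repeated.size)  # interleaver (random here)
    interleaved = repeated[perm]
    parity = accumulate(interleaved)           # inner accumulator
    # Puncture the inner accumulator output: keep every
    # puncture_period-th parity bit to raise the overall code rate.
    punctured = parity[puncture_period - 1::puncture_period]
    return np.concatenate([info_bits, punctured])  # systematic + parity

info = rng.integers(0, 2, size=64)
codeword = ara_encode(info)
print(len(info), "info bits ->", len(codeword), "coded bits")  # rate 1/2 here
```

With `q=3` and `puncture_period=3`, 64 information bits yield 64 punctured parity bits, giving an overall rate of 1/2; keeping more or fewer parity bits per period traces out the higher-rate families mentioned above.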