The moderate complexity of low-density parity-check (LDPC) codes under iterative decoding is attributed to the sparseness of their parity-check matrices. It is therefore of interest to consider how sparse the parity-check matrices of binary linear block codes can be as a function of the gap between their achievable rates and the channel capacity. This issue was addressed by Sason and Urbanke, and it is revisited in this paper. The remarkable performance of LDPC codes under practical and suboptimal decoding algorithms motivates the assessment of the inherent loss in performance attributed to the structure of the code or ensemble under maximum-likelihood (ML) decoding, and of the additional loss imposed by the suboptimality of the decoder. These issues are addressed by obtaining upper bounds on the achievable rates of binary linear block codes, and lower bounds on the asymptotic density of their parity-check matrices, as a function of the gap between their achievable rates and the channel capacity; these bounds are valid under ML decoding, and hence they are valid for any suboptimal decoding algorithm. The new bounds improve on previously reported results by Burshtein and by Sason and Urbanke, and they hold for transmission over an arbitrary memoryless binary-input output-symmetric (MBIOS) channel. The significance of these information-theoretic bounds is in assessing the tradeoff between the asymptotic performance of LDPC codes and their decoding complexity (per iteration) under message-passing decoding. They are also helpful in studying the potential achievable rates of ensembles of LDPC codes under optimal decoding; by comparing these thresholds with those calculated by the density evolution technique, one obtains a measure of the asymptotic suboptimality of iterative decoding algorithms.
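For orientation, the earlier Sason-Urbanke lower bound that the new results tighten has the following functional form; this is a sketch recalled from the prior work, with the channel-dependent constants $K_1$, $K_2$ and the exact normalization of the density left unspecified here (the improved bounds of this paper are given in the full text).

\[
  % Sketch of the Sason--Urbanke-type lower bound on parity-check density.
  % \epsilon is the multiplicative gap to capacity, i.e. the code rate is
  % R = (1-\epsilon)\,C over an MBIOS channel of capacity C, and \Delta is
  % the asymptotic parity-check density (number of ones in the parity-check
  % matrix, suitably normalized, e.g. per information bit).
  \Delta \;\geq\; \frac{K_1 + K_2 \ln\frac{1}{\epsilon}}{1-\epsilon},
  \qquad K_1, K_2 > 0 \ \text{depending only on the channel},
\]

so the parity-check density must grow at least logarithmically in $1/\epsilon$ as the achievable rate approaches capacity, which is the type of rate-versus-density tradeoff the abstract refers to.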