Abstract

This paper introduces two effective techniques for reducing the decoding complexity of turbo product codes (TPC) that use extended Hamming codes as component codes. We first propose an advanced hard-input soft-output (HISO) decoding algorithm, which is applicable when the estimated syndrome indicates a double error. Conventional soft-input soft-output (SISO) decoding algorithms perform 2^p hard-decision decoding (HDD) operations (where p is the number of least reliable bits) to correct errors, whereas the proposed algorithm requires only a single HDD, thereby lowering the decoding complexity. In addition, we propose an early termination technique for undecodable blocks, based on the difference in the ratio of double-error syndrome detections between two consecutive half-iterations. This early termination effectively lowers the average number of iterations, which further reduces the overall decoding complexity. Simulation results show that the proposed techniques significantly reduce the computational complexity of TPC decoding while keeping the error correction performance nearly the same as that of conventional methods.
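To illustrate the syndrome classification that the proposed HISO algorithm relies on, the following is a minimal Python sketch (not the authors' implementation) for an assumed (8,4) extended Hamming code: a nonzero Hamming syndrome combined with satisfied overall parity signals a double error. The function name and data layout are hypothetical.

```python
# Parity-check matrix of the (7,4) Hamming code; column i is the
# binary representation of i (i = 1..7).
H = [[(c >> r) & 1 for c in range(1, 8)] for r in range(3)]

def classify_syndrome(word8):
    """Classify an 8-bit extended-Hamming word (hypothetical helper).

    word8 = 7 Hamming-code bits followed by one overall parity bit.
    Returns 'no_error', 'single_error', or 'double_error'.
    """
    hamming_bits = word8[:7]
    s = [sum(h * b for h, b in zip(row, hamming_bits)) % 2 for row in H]
    parity_ok = (sum(word8) % 2) == 0  # overall parity over all 8 bits
    if all(v == 0 for v in s) and parity_ok:
        return 'no_error'
    if not parity_ok:
        # Odd number of errors: treated as a correctable single error.
        return 'single_error'
    # Nonzero syndrome with even parity: double-error detection.
    return 'double_error'
```

In the classification above, the double-error case is exactly the condition under which the proposed algorithm replaces the usual bank of HDD trials with a single HDD operation.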

Highlights

  • Turbo product codes (TPC) are decoded in general using soft-input soft-output (SISO) decoding, as introduced by Pyndiah in 1994 [1, 2], which nearly achieves the Shannon capacity limit with reasonable decoding complexity

  • The relative complexity serves as the indicator for the comparison, which is useful in identifying the reduction in computational complexity from a macroscopic perspective

  • To resolve problems related to high complexity, we propose the low-complexity BFHDDHISO decoding algorithm


Summary

Introduction

Turbo product codes (TPC) are generally decoded using soft-input soft-output (SISO) decoding, as introduced by Pyndiah in 1994 [1, 2], which nearly achieves the Shannon capacity limit with reasonable decoding complexity. This algorithm was based on the Hamming-distance results before and after error correction by algebraic (hard) decoding in the previous iteration. The technique effectively lowers the decoding complexity in proportion to the decrease of the p value. Ahn et al. [25] introduced a highly effective low-complexity decoding algorithm for TPC based on the syndrome characteristics of extended Hamming codes. In this algorithm, the valid codeword can be determined conditionally by using only a single HDD operation when a single-error syndrome is identified. As a result, the two proposed algorithms considerably decrease the computational complexity compared with conventional syndrome-based decoding algorithms, while the error correction performance remains almost the same as before.
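For context on where the 2^p HDD cost arises, the following is an illustrative Python sketch (assumed Chase-II-style test-pattern generation, not the paper's code; the function name is hypothetical): each of the 2^p patterns over the p least reliable bits is normally followed by one HDD trial, which is the cost the proposed algorithms avoid.

```python
from itertools import product

def chase_test_patterns(reliabilities, p):
    """Enumerate the 2**p Chase-II test patterns (hypothetical helper).

    reliabilities: per-bit confidence values (e.g. |LLR|); smaller
    means less reliable. Returns a list of 2**p error patterns, each
    a set of bit positions to flip before one HDD trial.
    """
    n = len(reliabilities)
    # Positions of the p least reliable bits.
    lrb = sorted(range(n), key=lambda i: reliabilities[i])[:p]
    patterns = []
    for flips in product([0, 1], repeat=p):
        patterns.append({idx for idx, f in zip(lrb, flips) if f})
    return patterns
```

Since the pattern count grows as 2^p, replacing this enumeration with a single conditional HDD (as in the syndrome-based schemes discussed above) removes the dominant term in the per-row decoding cost.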

Background
Chase-Pyndiah decoding algorithm
Syndrome-based decoding algorithms
Scheme 1
Scheme 2
Results and discussion
Reduction of average half-iteration number through early termination
Relative complexity reduction analysis based on SISO decoding and HDD operation
Complexity reduction analysis of the required number of arithmetic operations
BER performance comparisons
Conclusions