For some time it has been known that, for fixed code length <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">n</tex> , binary BCH codes appear to be most efficient when the number of information bits <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">k</tex> is between <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">1/4 n</tex> and <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">3/4 n</tex> [1, p. 443], [2, p. 219]. In this correspondence the efficiency of block codes on a binary-quantized additive white Gaussian noise channel is analyzed as a function of the code rate <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">r = k/n</tex> for hard-decision decoding. Closed-form analytical expressions for upper and lower bounds on block code performance are derived for large code lengths <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">n</tex> . They show that, for the best codes, a relatively broad maximum in efficiency occurs at rates near 0.4. The performance of the BCH codes is also compared with the bounds.
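The rate dependence described above arises because, for hard-decision decoding of antipodal (BPSK) signalling at fixed energy per information bit, the binary-quantized channel behaves as a binary symmetric channel whose crossover probability depends on the code rate. A minimal sketch of this standard textbook relation (the BPSK assumption and the function name are illustrative, not taken from the correspondence):

```python
import math

def crossover_probability(rate, ebno_db):
    """Crossover probability of the binary symmetric channel obtained by
    binary quantization (hard decisions) of an AWGN channel with BPSK:
    p = Q(sqrt(2 * r * Eb/N0)).  The energy per coded bit is r * Eb,
    so lowering the rate r makes each received bit noisier, while the
    added redundancy corrects more errors -- the trade-off underlying
    the broad optimum at intermediate rates."""
    ebno = 10.0 ** (ebno_db / 10.0)              # dB -> linear Eb/N0
    arg = math.sqrt(2.0 * rate * ebno)           # Q-function argument
    return 0.5 * math.erfc(arg / math.sqrt(2.0)) # Q(x) = erfc(x/sqrt(2))/2

# At Eb/N0 = 3 dB, a rate-1/2 code sees a noticeably noisier channel
# than an uncoded (rate-1) transmission:
p_half = crossover_probability(0.5, 3.0)
p_full = crossover_probability(1.0, 3.0)
```

The rate-optimization question analyzed in the correspondence is then which <tex xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">r</tex> best balances this per-bit degradation against the error-correcting power of the redundancy.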