Abstract

For coded transmission over a memoryless channel, two kinds of mutual information are considered: the mutual information between a code symbol and its noisy observation, and the overall mutual information between the encoder input and the decoder output. The overall mutual information is interpreted as a combination of the mutual informations associated with the individual code symbols; exploiting code constraints in the decoding procedure is thus interpreted as combining mutual informations. For single parity-check codes and repetition codes, we present bounds on the overall mutual information that are based only on the mutual informations associated with the individual code symbols. Using these mutual information bounds, we compute bounds on extrinsic information transfer (EXIT) functions and on information processing characteristics (IPC) for these codes.
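The idea of combining per-symbol mutual informations can be illustrated with a small numerical sketch. The Python code below is not the paper's construction; it only evaluates the standard closed-form combining expressions for two channel models that the information-combining literature commonly uses as extremal cases, the binary erasure channel (BEC) and the binary symmetric channel (BSC), for a length-2 repetition code and a length-3 single parity-check code. All function names and the specific code lengths are illustrative choices, not taken from the paper.

```python
import numpy as np


def h2(p):
    """Binary entropy function in bits, safe at the endpoints."""
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)


def bsc_p_from_mi(mi, tol=1e-12):
    """Crossover probability p in [0, 0.5] of a BSC with mutual information mi = 1 - h2(p)."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if 1.0 - h2(mid) > mi:
            lo = mid  # channel still too good, increase the noise
        else:
            hi = mid
    return 0.5 * (lo + hi)


def rep2_combined_mi(i1, i2, model):
    """Mutual information I(X; Y1, Y2) for a length-2 repetition code,
    assuming both observation channels are BECs or both are BSCs."""
    if model == "BEC":
        # Erasure probability of each channel is 1 - i; X is lost only if both erase.
        return 1.0 - (1.0 - i1) * (1.0 - i2)
    p1, p2 = bsc_p_from_mi(i1), bsc_p_from_mi(i2)
    # Joint output distribution for uniform X; I = H(Y1, Y2) - H(Y1, Y2 | X).
    given0 = np.array([(1 - p1) * (1 - p2), (1 - p1) * p2, p1 * (1 - p2), p1 * p2])
    given1 = given0[::-1]
    joint = 0.5 * (given0 + given1)
    h_joint = -np.sum(joint * np.log2(joint))
    return h_joint - h2(p1) - h2(p2)


def spc3_extrinsic_mi(i1, i2, model):
    """Extrinsic mutual information about the third bit of a length-3
    single parity-check code, given noisy observations of the other two bits."""
    if model == "BEC":
        # The parity constraint resolves the bit only if neither observation is erased.
        return i1 * i2
    p1, p2 = bsc_p_from_mi(i1), bsc_p_from_mi(i2)
    # The XOR of two BSC-corrupted bits is again BSC-distributed.
    p_ext = 0.5 * (1.0 - (1.0 - 2.0 * p1) * (1.0 - 2.0 * p2))
    return 1.0 - h2(p_ext)


if __name__ == "__main__":
    i1 = i2 = 0.5  # per-symbol mutual information of each observation
    for model in ("BEC", "BSC"):
        print(model,
              "repetition:", round(rep2_combined_mi(i1, i2, model), 4),
              "SPC:", round(spc3_extrinsic_mi(i1, i2, model), 4))
```

With both per-symbol mutual informations set to 0.5, the sketch gives roughly 0.75 (BEC) versus 0.71 (BSC) for the repetition code and roughly 0.25 (BEC) versus 0.29 (BSC) for the single parity-check code, illustrating the kind of gap between channel models that bounds based only on per-symbol mutual information must cover.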
