Abstract

This letter considers communication over the Gaussian multiple access channel (MAC) with multiple sources. A sequence of length-$n$ codes for the Gaussian MAC is said to be capacity-achieving if the codes achieve the sum-rate capacity. It is shown that for any sequence of capacity-achieving codes, the normalized relative entropy between the output distribution induced by the length-$n$ code over the Gaussian MAC, denoted by $p_{Y^n}$, and the $n$-fold product of the capacity-achieving output distribution $p_Y^*$, denoted by $p_{Y^n}^*$, converges to zero as the blocklength grows, i.e., $\frac{1}{n}D(p_{Y^n}\Vert p_{Y^n}^*)\rightarrow 0$. Applications of the convergence result are also discussed.
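For concreteness, the LaTeX sketch below spells out the statement under a standard setup that is assumed here for illustration and is not taken from the letter itself: a $K$-user Gaussian MAC with unit noise variance and per-user power constraints $P_1,\dots,P_K$, for which the capacity-achieving output distribution is the zero-mean Gaussian of variance $1+\sum_k P_k$.

```latex
% Hedged sketch: a standard K-user Gaussian MAC (unit noise variance,
% per-user power constraints P_k) assumed for illustration; the letter's
% own model may differ in normalization.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $Y_i = \sum_{k=1}^{K} X_{k,i} + Z_i$ with $Z_i \sim \mathcal{N}(0,1)$ i.i.d.\ and
$\tfrac{1}{n}\sum_{i=1}^{n} x_{k,i}^2 \le P_k$ for each user $k$.
The sum-rate capacity is
\begin{equation}
  C_{\mathrm{sum}} = \tfrac{1}{2}\log\!\Bigl(1 + \sum_{k=1}^{K} P_k\Bigr),
\end{equation}
attained by the capacity-achieving output distribution
$p_Y^* = \mathcal{N}\bigl(0,\, 1 + \sum_{k=1}^{K} P_k\bigr)$,
with $p_{Y^n}^* = \prod_{i=1}^{n} p_Y^*$.
For any sequence of length-$n$ codes whose sum rate approaches $C_{\mathrm{sum}}$
with vanishing error probability, the induced output distribution $p_{Y^n}$ satisfies
\begin{equation}
  \frac{1}{n}\, D\bigl(p_{Y^n} \,\big\Vert\, p_{Y^n}^*\bigr) \longrightarrow 0
  \quad \text{as } n \to \infty .
\end{equation}
\end{document}
```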
