Abstract
For general memoryless systems, the typical information theoretic solution, when it exists, has a "single-letter" form. This reflects the fact that optimum performance can be approached by a random code (or a random binning scheme), generated using independent and identically distributed copies of some single-letter distribution. Is that the form of the solution of any (information theoretic) problem? In fact, some counterexamples are known, perhaps the most famous being the Körner-Marton "two help one" problem, where the modulo-two sum of two binary sources is to be decoded from their independent encodings. In this paper we provide another counterexample, the "doubly-dirty" multiple access channel (MAC). Like the Körner-Marton problem, this example is associated with a multiterminal scenario where side information is distributed among several terminals; each transmitter knows part of the channel interference, but the receiver is not aware of any part of it. We give an explicit solution for the capacity region of a binary version of the doubly-dirty MAC, demonstrate how this capacity region can be approached using a linear coding scheme, and prove that the "best known single-letter region" is strictly contained in it. We also state a conjecture regarding a similar rate loss of single-letter characterization in the Gaussian case.
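The key idea behind the linear scheme can be illustrated with a toy sketch. This is a minimal illustration, not the paper's scheme: it assumes a noiseless binary model where the channel output is the mod-two sum of the inputs and both interference sequences, and the function names (`transmit`, `channel`) are hypothetical. Each transmitter pre-cancels only the interference part it knows, and the two unknown parts cancel at the channel, so the receiver observes the XOR of the messages exactly:

```python
import random

def transmit(v, s):
    """Pre-cancel the locally known interference part (mod 2)."""
    return [vi ^ si for vi, si in zip(v, s)]

def channel(x1, x2, s1, s2):
    """Noiseless binary doubly-dirty MAC: y = x1 + x2 + s1 + s2 (mod 2)."""
    return [a ^ b ^ c ^ d for a, b, c, d in zip(x1, x2, s1, s2)]

n = 16
random.seed(0)
v1 = [random.randint(0, 1) for _ in range(n)]  # message bits, user 1
v2 = [random.randint(0, 1) for _ in range(n)]  # message bits, user 2
s1 = [random.randint(0, 1) for _ in range(n)]  # interference known to user 1 only
s2 = [random.randint(0, 1) for _ in range(n)]  # interference known to user 2 only

y = channel(transmit(v1, s1), transmit(v2, s2), s1, s2)

# Both interference sequences cancel: the receiver sees v1 XOR v2 exactly.
assert y == [a ^ b for a, b in zip(v1, v2)]
```

With an actual linear (parity-check) code shared by both users, the XOR of two codewords is again a codeword, which is what lets the receiver decode this mod-two sum reliably in the noisy setting; random i.i.d. codebooks lack this closure property, which is one intuition for the single-letter gap.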