Abstract

This paper shows that many notions in information theory (e.g., entropy and capacity regions) can be defined using probabilistic independence alone. We consider the first-order theory of random variables with the probabilistic independence relation, which concerns statements built only from random variables, the probabilistic independence symbol, logical operators, and existential and universal quantifiers. Although probabilistic independence is the only non-logical relation included, this theory is surprisingly expressive: it can express notions such as entropy and cardinality, and it can interpret true first-order arithmetic over the natural numbers (and is hence undecidable). We also characterize the capacity region of a general class of multiuser coding settings (including the broadcast channel, interference channel, and relay channel) using a first-order formula, which can be regarded as a "single-letter characterization" of the capacity regions of these settings (conventional single-letter characterizations are existential formulae, whereas our formula contains both existential and universal quantifiers).
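To illustrate the flavor of such statements (an illustrative sketch in our own notation, not drawn from the paper itself): write X ⊥ Y for "X and Y are independent". Since a random variable independent of itself is almost surely constant, the atomic formula X ⊥ X defines constancy, and its negation defines non-degeneracy, using independence alone. A sample first-order sentence in this language is

\forall X \, \exists Y \; \bigl( (X \perp Y) \wedge \neg (Y \perp Y) \bigr),

which asserts that for every random variable X there exists a non-constant random variable Y independent of X. Quantifiers range over random variables, and ⊥ is the only non-logical symbol, matching the signature described in the abstract.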
