Abstract
Given $n$ discrete random variables $\Omega = \{X_1, \ldots, X_n\}$, associated with any subset $\alpha$ of $\{1, 2, \ldots, n\}$ there is a joint entropy $H(X_\alpha)$, where $X_\alpha = \{X_i : i \in \alpha\}$. This can be viewed as a function defined on $2^{\{1, 2, \ldots, n\}}$ taking values in $[0, +\infty)$; we call it the entropy function of $\Omega$. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that it is nondecreasing; and the nonnegativity of the conditional mutual informations implies that it is submodular: for any two subsets $\alpha$ and $\beta$ of $\{1, 2, \ldots, n\}$,
$$H_\Omega(\alpha) + H_\Omega(\beta) \ge H_\Omega(\alpha \cup \beta) + H_\Omega(\alpha \cap \beta).$$
These properties are the so-called basic information inequalities of Shannon's information measures. Do they fully characterize the entropy function? To make this question precise, we view an entropy function as a $(2^n - 1)$-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set $\{1, 2, \ldots, n\}$. Let $\Gamma_n$ be the cone in $\mathbb{R}^{2^n - 1}$ consisting of all vectors that satisfy these three properties when viewed as functions defined on $2^{\{1, 2, \ldots, n\}}$, and let $\Gamma_n^*$ be the set of all $(2^n - 1)$-dimensional vectors that arise as entropy functions of some set of $n$ discrete random variables. The question can then be restated as: is it true that $\bar{\Gamma}_n^* = \Gamma_n$ for every $n$, where $\bar{\Gamma}_n^*$ denotes the closure of $\Gamma_n^*$? The answer is "yes" for $n = 2$ and $n = 3$, as proved in our previous work, and intuition may suggest that it should be "yes" for every $n$. The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables which gives a negative answer to this fundamental problem in information theory: $\bar{\Gamma}_n^*$ is strictly smaller than $\Gamma_n$ whenever $n > 3$. While this new inequality gives a nontrivial outer bound on the cone $\bar{\Gamma}_4^*$, an inner bound for $\bar{\Gamma}_4^*$ is also given. The inequality is also extended to any number of random variables.
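To make the definitions concrete, here is a minimal Python sketch (the function names and the example distribution are mine, not from the paper) that computes the entropy function of a small joint distribution as a vector indexed by the subsets of the ground set, and checks the three basic properties, i.e., that the vector lies in $\Gamma_n$:

```python
import itertools
import math

def entropy_function(joint, n):
    """Entropy function H_Omega: map every subset alpha of {0, ..., n-1}
    to the joint entropy H(X_alpha) in bits. `joint` maps outcome tuples
    (x_1, ..., x_n) to their probabilities."""
    H = {(): 0.0}
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            # Marginalize the joint distribution onto the coordinates in alpha.
            marginal = {}
            for outcome, p in joint.items():
                key = tuple(outcome[i] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            H[alpha] = -sum(p * math.log2(p) for p in marginal.values() if p > 0)
    return H

def lies_in_gamma_n(H, n, tol=1e-9):
    """Check the three basic information inequalities: nonnegativity,
    monotonicity, and submodularity of the entropy function."""
    subsets = [s for r in range(n + 1)
               for s in itertools.combinations(range(n), r)]
    for a in subsets:
        if H[a] < -tol:                                   # nonnegative
            return False
        for b in subsets:
            union = tuple(sorted(set(a) | set(b)))
            inter = tuple(sorted(set(a) & set(b)))
            if set(a) <= set(b) and H[a] > H[b] + tol:    # nondecreasing
                return False
            # H(alpha) + H(beta) >= H(alpha u beta) + H(alpha n beta)
            if H[a] + H[b] < H[union] + H[inter] - tol:
                return False
    return True

# Example: X1, X2 independent fair bits, X3 = X1 XOR X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
print(lies_in_gamma_n(entropy_function(joint, 3), 3))    # True
```

Every entropy function passes this check by construction; the question raised above is the converse, namely whether every point of $\Gamma_n$ is (a limit of) such entropy vectors.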
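The abstract does not display the new four-variable inequality itself. As it is usually quoted in the literature (the exact statement and proof appear in the body of the paper), it reads: for any four discrete random variables $X_1, X_2, X_3, X_4$,
$$2I(X_3; X_4) \le I(X_1; X_2) + I(X_1; X_3, X_4) + 3I(X_3; X_4 \mid X_1) + I(X_3; X_4 \mid X_2).$$
Since $\Gamma_4$ contains points that satisfy all the basic inequalities yet violate this one, $\bar{\Gamma}_4^*$ is strictly contained in $\Gamma_4$.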