Abstract

In this paper we prove a countable set of non-Shannon-type linear information inequalities for entropies of discrete random variables, i.e., information inequalities which cannot be reduced to the basic inequality I(X : Y | Z) ≥ 0. Our results generalize the inequalities of Z. Zhang and R. Yeung (1998), who found the first examples of non-Shannon-type information inequalities.
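For context, the two inequalities referred to above can be written out explicitly. The first display is the basic Shannon-type inequality expanded in joint entropies; the second is the Zhang–Yeung inequality quoted in one of its commonly cited equivalent forms, which may differ in normalization from the form used in the body of the paper.

% Basic Shannon-type inequality: nonnegativity of conditional mutual information,
% written in terms of the joint entropies of X, Y, Z.
\[
  I(X : Y \mid Z) \;=\; H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) \;\ge\; 0 .
\]

% Zhang-Yeung (1998) non-Shannon-type inequality for four discrete random
% variables A, B, C, D, in one standard equivalent form.
\[
  2\, I(C : D) \;\le\; I(A : B) + I(A : C, D) + 3\, I(C : D \mid A) + I(C : D \mid B) .
\]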
