Abstract

Mapping the structure of the entropy region of at least four jointly distributed random variables is an important open problem. Even partial knowledge about this region has far-reaching consequences in other areas of mathematics, such as information theory, cryptography, probability theory, and combinatorics. At present, the only known method of exploring the entropy region is, or is equivalent to, that of Zhang and Yeung from 1998. Using some non-trivial properties of the entropy function, their method is transformed into solving high-dimensional linear multiobjective optimization problems. Benson’s outer approximation algorithm is a fundamental tool for solving such optimization problems. An improved version of Benson’s algorithm is presented, which requires solving only one scalar linear program in each iteration rather than two or three as in previous versions. Special care is taken for numerical stability in the algorithm design. The implemented algorithm is used to verify previous statements about the entropy region, as well as to explore it further. Experimental results demonstrate the viability of the improved Benson’s algorithm for determining the extremal set of medium-sized, numerically ill-posed optimization problems. With larger problem sizes, two limitations of Benson’s algorithm are observed: the inefficiency of the scalar LP solver, and the unexpectedly large number of intermediate vertices.
