Abstract
Computing the LP bound on network coding capacity and proving a basic information inequality are both linear optimization problems. The number of dimensions and constraints in these problems grows exponentially with the number of random variables involved. First, generating the exponentially large constraint set exhausts available memory as the number of random variables increases. Second, the well-known simplex algorithm for solving linear programming problems has exponential worst-case complexity in the problem size, making it doubly exponential in the number of random variables. In this correspondence, we focus on generating a constraint set of significantly reduced size that nevertheless characterizes the same feasible region for these optimization problems. As a result, it is now possible to produce constraint sets for problems with a larger number of random variables, which was previously impractical due to limited memory resources. Moreover, the reduction in problem size also allows the problems to be solved faster.