Abstract

In this paper, we are concerned with the maximin of mutual informations, which typically arises when specifying the capacity of a compound channel or the maximum rate for broadcasting common messages to multiple receivers. Firstly, analytical properties of the minimum of the mutual informations, such as continuity, concavity, and differentiability, are established. The necessary and sufficient conditions for the capacity-achieving input distributions are revealed by reformulating the original problem into an equivalent differentiable form. Bounds on the capacity are also presented. Secondly, an iterative method to compute the maximin value of two mutual informations is derived; it is almost the same as the Arimoto-Blahut algorithm, except that a more complicated maximization is required in the second phase. Finally, the convergence of the proposed algorithm is proved.
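For context, the two-phase iteration the abstract refers to extends the standard Arimoto-Blahut algorithm for a single discrete memoryless channel. Below is a minimal sketch of that baseline (the function name and parameters are illustrative, not from the paper); the paper's maximin variant replaces the closed-form second phase with a more involved maximization over the input distribution.

```python
import numpy as np

def blahut_arimoto(W, n_iter=200, tol=1e-12):
    """Standard Arimoto-Blahut iteration for the capacity of a single DMC.

    W[x, y] = P(y | x), rows sum to 1.
    Returns (capacity in nats, capacity-achieving input distribution).
    """
    n_x, _ = W.shape
    p = np.full(n_x, 1.0 / n_x)  # start from the uniform input distribution
    for _ in range(n_iter):
        # Phase 1: backward channel q(x|y) proportional to p(x) * W(y|x)
        q = p[:, None] * W
        q /= q.sum(axis=0, keepdims=True)
        # Phase 2: closed-form input update p(x) ∝ exp(sum_y W(y|x) log q(x|y))
        with np.errstate(divide="ignore", invalid="ignore"):
            logq = np.where(W > 0, np.log(q), 0.0)
        new_p = np.exp((W * logq).sum(axis=1))
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            p = new_p
            break
        p = new_p
    # Mutual information I(p, W) = sum_{x,y} p(x) W(y|x) log(W(y|x) / r(y))
    r = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(W > 0, np.log(W / r), 0.0)
    return float((p[:, None] * W * ratio).sum()), p
```

For a binary symmetric channel with crossover probability 0.1, this recovers the known capacity ln 2 − H(0.1) in nats with the uniform input as optimizer. In the compound-channel setting, phase 2 must instead maximize against the minimum of the mutual informations, which no longer admits this closed form.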
