Abstract

Directed information and its variants are utilized extensively in characterizing the capacity of channels with memory and feedback, in nonanticipative lossy data compression, and in their generalizations to networks. In this paper, we derive several functional and topological properties of directed information, defined on general abstract alphabets (complete separable metric spaces), using the topology of weak convergence of probability measures. These include convexity of the sets of consistent distributions, which uniquely define causally conditioned distributions; convexity and concavity of directed information with respect to these sets of consistent distributions; and weak compactness of such sets of distributions, of their joint distributions, and of their marginals. Furthermore, we show lower semicontinuity of directed information and, under certain conditions, we also establish continuity. Finally, we derive variational equalities for directed information, including sequential versions. These may be viewed as the analog of the variational equalities of mutual information utilized in Blahut–Arimoto algorithms. In summary, we extend the basic functional and topological properties of mutual information to directed information. These properties are discussed throughout the paper in the context of extremum problems of directed information.
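For reference, the quantity studied here, directed information from a sequence $X^n = (X_1,\dots,X_n)$ to a sequence $Y^n = (Y_1,\dots,Y_n)$, is commonly defined (this is the standard definition, sketched here for orientation rather than quoted from the paper) as

\[
  I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I\big(X^{i};\, Y_i \,\big|\, Y^{i-1}\big),
\]

in contrast to the chain-rule expansion of mutual information, $I(X^n; Y^n) = \sum_{i=1}^{n} I\big(X^{n}; Y_i \,\big|\, Y^{i-1}\big)$. The causal conditioning on $X^{i}$ rather than the full $X^{n}$ is what makes directed information suitable for channels with feedback and for nonanticipative compression.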
