Abstract

Shannon entropy, as well as conditional entropy and mutual information, can be compactly expressed in terms of the relative entropy, or Kullback–Leibler divergence. In this sense, the divergence can be seen as a parent quantity of entropy, conditional entropy and mutual information, and many properties of the latter quantities can be derived from properties of the divergence. Similarly, we will define the Rényi entropy, conditional entropy and mutual information in terms of a parent quantity, the Rényi divergence. We will see in the following chapters that this approach is very natural and leads to operationally significant measures with powerful mathematical properties. This observation allows us to first focus our attention on quantum generalizations of the Kullback–Leibler and Rényi divergences and explore their properties, which is the topic of this chapter.
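The relations alluded to above are standard identities of classical information theory: for a distribution $P_X$ on an alphabet $\mathcal{X}$, the Shannon entropy satisfies $H(X) = \log|\mathcal{X}| - D(P_X \| U_X)$ with $U_X$ uniform, and the mutual information satisfies $I(X;Y) = D(P_{XY} \| P_X \times P_Y)$. As a minimal numerical illustration (the joint distribution below is an arbitrary example, not taken from the text), both identities can be checked directly:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, with 0 log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def entropy(p):
    """Shannon entropy H(p) in bits."""
    mask = p > 0
    return float(-np.sum(p[mask] * np.log2(p[mask])))

# An illustrative joint distribution P_XY over a 2x3 alphabet.
P_xy = np.array([[0.10, 0.20, 0.15],
                 [0.25, 0.05, 0.25]])
P_x = P_xy.sum(axis=1)  # marginal distribution of X
P_y = P_xy.sum(axis=0)  # marginal distribution of Y

# Mutual information as a divergence: I(X;Y) = D(P_XY || P_X x P_Y).
I_div = kl_divergence(P_xy.ravel(), np.outer(P_x, P_y).ravel())

# The same quantity from entropies: I(X;Y) = H(X) + H(Y) - H(XY).
I_ent = entropy(P_x) + entropy(P_y) - entropy(P_xy.ravel())
assert np.isclose(I_div, I_ent)

# Entropy as divergence from uniform: H(X) = log|X| - D(P_X || U_X).
U_x = np.full_like(P_x, 1.0 / P_x.size)
assert np.isclose(entropy(P_x), np.log2(P_x.size) - kl_divergence(P_x, U_x))
```

This is exactly the sense in which the divergence acts as a "parent quantity": each entropic measure is recovered by evaluating the divergence between a suitable pair of distributions.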
