Abstract

In classical probability theory, there are two important quantities which measure the amount of "information" carried by a given distribution: the Fisher information and the entropy. Various relations between these quantities form a cornerstone of classical probability theory and statistics. Voiculescu introduced free probability analogues of them, called the free Fisher information and the free entropy, denoted by Φ and χ, respectively. However, some gaps remain in our present understanding of these quantities. In particular, there exist two different approaches, each yielding a notion of entropy and of Fisher information. One hopes that both approaches will eventually be shown to give the same result, but at the moment this is not clear. Thus, for the time being, we have to distinguish the free entropy χ and the free Fisher information Φ coming from the first approach (via microstates) from the free entropy χ∗ and the free Fisher information Φ∗ coming from the second, non-microstates approach (via conjugate variables).
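For orientation, the classical quantities referred to above can be written down explicitly; the following is a sketch for a single real variable with a sufficiently smooth, decaying probability density f. The one-variable free entropy formula (where the microstates and non-microstates approaches are known to agree) is also recorded.

```latex
% Classical Fisher information and entropy of a density f on \mathbb{R}
% (assuming f smooth and suitably decaying):
I(f) = \int_{\mathbb{R}} \frac{f'(t)^2}{f(t)}\,dt,
\qquad
H(f) = -\int_{\mathbb{R}} f(t)\log f(t)\,dt.

% Voiculescu's free entropy of a single self-adjoint variable
% with distribution \mu:
\chi(\mu) = \iint_{\mathbb{R}^2} \log\lvert s-t\rvert \,d\mu(s)\,d\mu(t)
            + \frac{3}{4} + \frac{1}{2}\log(2\pi).
```

The subtlety alluded to in the abstract arises only for several non-commuting variables, where the microstates quantity χ and the conjugate-variables quantity χ∗ are defined by genuinely different constructions.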
