Abstract
In classical probability theory, there are two fundamental quantities that measure the amount of “information” in a given distribution: the Fisher information and the entropy. The many relations between them form a cornerstone of classical probability theory and statistics. Voiculescu introduced free probability analogues of these quantities, called the free Fisher information and the free entropy, denoted by Φ and χ, respectively. However, gaps remain in our present understanding of these quantities. In particular, there are two different approaches, each yielding a notion of entropy and of Fisher information. One hopes to eventually prove that both approaches give the same result, but at present this is not known. Thus, for the time being, we have to distinguish the free entropy χ and the free Fisher information Φ coming from the first approach (via microstates) from the free entropy χ∗ and the free Fisher information Φ∗ coming from the second, non-microstates approach (via conjugate variables).
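For orientation, the classical quantities for a probability density p on the real line, together with a sketch of their one-variable free analogues, can be written as follows. (The formulas for Φ and χ below are for a single self-adjoint variable whose distribution has density p; the multiplicative and additive normalization constants vary between references and are hedged here.)

```latex
% Classical Fisher information and differential entropy of a density p:
I(p) = \int_{\mathbb{R}} \frac{p'(x)^2}{p(x)}\,dx,
\qquad
H(p) = -\int_{\mathbb{R}} p(x)\log p(x)\,dx.

% One-variable free analogues (Voiculescu); constants depend on the
% chosen normalization:
\Phi(x) \;\propto\; \int_{\mathbb{R}} p(t)^3\,dt,
\qquad
\chi(x) = \iint_{\mathbb{R}^2} \log|s-t|\;p(s)\,p(t)\,ds\,dt \;+\; \mathrm{const}.
```

In one variable both approaches are known to agree; the open question concerns several noncommuting variables.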