Abstract

Von Neumann defined the informational entropy of a density matrix ρ by the expression S(ρ) = −tr(ρ ln ρ) [J. von Neumann, Mathematische Grundlagen der Quantenmechanik (Springer-Verlag, Berlin, 1932)]. Here, starting from the definitions of the Shannon entropy and the Rényi entropy of random variables, and using some (reasonable) rules of inference, two expressions are obtained for the quantum entropy of a given (deterministic) square matrix, independent of any probabilistic framework or randomization technique. These definitions apply directly to linear operators on Hilbert spaces, even when they are not positive operators of trace class, and they contain the von Neumann entropy as a special case. These new measures of uncertainty could provide new approaches to some problems related to structural complexity.
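As background for the abstract, the following is a minimal sketch of the standard von Neumann entropy S(ρ) = −tr(ρ ln ρ) and the Rényi entropy of order α for an ordinary density matrix (Hermitian, positive semidefinite, unit trace), computed via the eigenvalue spectrum. It does not reproduce the paper's new definitions for general (non-positive, non-trace-class) matrices; the function names and the numerical cutoff are illustrative choices.

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -tr(rho ln rho) for a density matrix rho.

    Assumes the standard setting (Hermitian, positive semidefinite,
    trace 1); the paper's extension beyond this case is not shown here.
    """
    eigenvalues = np.linalg.eigvalsh(rho)
    # Discard numerically zero eigenvalues; 0 * ln 0 is taken as 0.
    positive = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(positive * np.log(positive)))

def renyi_entropy(rho: np.ndarray, alpha: float) -> float:
    """Rényi entropy S_alpha(rho) = (1/(1-alpha)) ln tr(rho^alpha), alpha != 1."""
    eigenvalues = np.linalg.eigvalsh(rho)
    positive = eigenvalues[eigenvalues > 1e-12]
    return float(np.log(np.sum(positive ** alpha)) / (1.0 - alpha))

if __name__ == "__main__":
    # Maximally mixed qubit state: both entropies equal ln 2 ≈ 0.6931.
    rho = np.eye(2) / 2
    print(von_neumann_entropy(rho))
    print(renyi_entropy(rho, 2.0))
```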
