Abstract

A definition of three-variable cumulative residual entropy is introduced and then used to obtain expressions for higher-order, or triple-wise, correlation measures based on cumulative residual densities. These information measures are calculated in continuous-variable quantum systems composed of three oscillators, and their behaviour is compared to that of the analogous measures from Shannon information theory. The newly introduced measures behave broadly consistently with the Shannon ones. There are, however, differences in interpretation in the case of three uncoupled oscillators, where the correlation is due to wave function symmetry. In interacting systems, the cumulative-based measures are shown to detect salient features that are also present in the Shannon-based ones.

Highlights

  • One interpretation of quantum mechanics is that of a statistical theory; analysis of the information obtained from the underlying densities of quantum systems is therefore essential to understanding quantum phenomena

  • These information measures are calculated in continuous-variable quantum systems composed of three oscillators, and their behaviour is compared to that of the analogous measures from Shannon information theory

  • The behaviours of the information measures in the uncoupled oscillators are examined as a function of the frequency ω

Introduction

One interpretation of quantum mechanics is that of a statistical theory; analysis of the information obtained from the underlying densities of quantum systems is therefore essential to understanding quantum phenomena. Important tools for interpreting this behaviour have evolved from consideration of position and momentum densities. A key theme in this regard is the measure of the uncertainty inherent in any probability distribution. A related concept, for distributions with two or more variables, is the quantification of the statistical correlation that exists between the variables. One way to achieve these goals is with Shannon information theory [1,2], where the central quantity is an information entropy. The original definition of the Shannon entropy is in terms of N discrete random variables,

$$S = -\sum_{x_1,\ldots,x_N} p(x_1,\ldots,x_N)\,\ln p(x_1,\ldots,x_N),$$

where $p(x_1,\ldots,x_N)$ is the joint probability of the outcome $(x_1,\ldots,x_N)$.
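
To make these quantities concrete, the sketch below computes a discrete Shannon entropy alongside a one-variable cumulative residual entropy, taken here in the Rao et al. form $\mathcal{E} = -\int F(x)\,\ln F(x)\,dx$, with $F(x) = P(X > x)$ the survival function. This is a minimal illustration under those assumptions, not code from the paper; the standard normal example merely stands in for a Gaussian oscillator ground-state position density, and NumPy/SciPy are assumed to be available.

    import numpy as np
    from scipy import integrate, stats

    def shannon_entropy(p):
        """Discrete Shannon entropy S = -sum_i p_i ln p_i (in nats)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # adopt the convention 0 ln 0 = 0
        return -np.sum(p * np.log(p))

    def cumulative_residual_entropy(survival, a, b):
        """One-variable CRE: -integral of F(x) ln F(x) over [a, b]."""
        def integrand(x):
            F = survival(x)
            return -F * np.log(F) if 0.0 < F < 1.0 else 0.0
        value, _ = integrate.quad(integrand, a, b)
        return value

    # A Gaussian stands in for an oscillator ground-state position density;
    # stats.norm.sf is the survival function of the standard normal.
    print(shannon_entropy([0.5, 0.25, 0.25]))                   # about 1.0397
    print(cumulative_residual_entropy(stats.norm.sf, -10, 10))

Straightforward quadrature suffices here because the integrand $-F\ln F$ vanishes as $F$ approaches 0 or 1, so the tails contribute negligibly; the multivariate definitions discussed in the paper would replace $F(x)$ with a joint survival function.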
