Abstract

There is a well-known analogy between statistical and quantum mechanics. In statistical mechanics, Boltzmann realized that the probability for a system in thermal equilibrium to occupy a given state is proportional to \(\exp(-E/kT)\), where \(E\) is the energy of that state. In quantum mechanics, Feynman realized that the amplitude for a system to undergo a given history is proportional to \(\exp(-S/i\hbar)\), where \(S\) is the action of that history. In statistical mechanics, we can recover Boltzmann's formula by maximizing entropy subject to a constraint on the expected energy. This raises the question: what is the quantum mechanical analogue of entropy? We give a formula for this quantity, which we call ``quantropy''. We recover Feynman's formula from assuming that histories have complex amplitudes, that these amplitudes sum to one and that the amplitudes give a stationary point of quantropy subject to a constraint on the expected action. Alternatively, we can assume the amplitudes sum to one and that they give a stationary point of a quantity that we call ``free action'', which is analogous to free energy in statistical mechanics. We compute the quantropy, expected action and free action for a free particle and draw some conclusions from the results.

Highlights

  • There is a famous analogy between statistical mechanics and quantum mechanics

  • A system can be in any state, but its probability of being in a state with energy E is proportional to exp(−E/T), where T is the temperature in units where Boltzmann's constant is one

  • A system can move along any path, but its amplitude for moving along a path with action S is proportional to exp(−S/iℏ), where ℏ is Planck's constant
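The first of these facts can be illustrated numerically. The sketch below (assuming a made-up three-level toy system; the energies and temperature are illustrative, not from the paper) builds the Boltzmann distribution and checks the standard thermodynamic identity F = ⟨E⟩ − TS relating free energy, expected energy and entropy:

```python
import math

# Toy system: three states with illustrative energies (not from the paper).
energies = [0.0, 1.0, 2.0]
T = 1.5  # temperature in units where Boltzmann's constant is one

# Boltzmann weights exp(-E/T), normalized by the partition function Z.
weights = [math.exp(-E / T) for E in energies]
Z = sum(weights)
probs = [w / Z for w in weights]

# Entropy, expected energy, and free energy F = -T ln Z.
entropy = -sum(p * math.log(p) for p in probs)
mean_E = sum(p * E for p, E in zip(probs, energies))
F = -T * math.log(Z)

# For the Boltzmann distribution, F = <E> - T S holds exactly.
print(abs(F - (mean_E - T * entropy)) < 1e-12)  # True
```

The analogous quantum identity, relating free action to expected action and quantropy, is what the paper develops.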

Summary

Introduction

There is a famous analogy between statistical mechanics and quantum mechanics. In statistical mechanics, a system can be in any state, but its probability of being in a state with energy E is proportional to exp(−E/T), where T is the temperature in units where Boltzmann's constant is one. In quantum mechanics, a system can move along any path, but its amplitude for moving along a path with action S is proportional to exp(−S/iℏ), where ℏ is Planck's constant. The probabilities exp(−E/T) arise naturally from maximizing entropy subject to a constraint on the expected value of energy. We might guess that the amplitudes exp(−S/iℏ) arise from maximizing some quantity subject to a constraint on the expected value of action. This quantity deserves a name, so let us tentatively call it "quantropy". Since amplitudes are complex, however, maximization does not directly apply; a less naive program is to derive the amplitudes in quantum mechanics from a "principle of stationary quantropy". We do this for a class of discrete systems and illustrate the idea with the example of a free particle, discretizing both space and time.
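The entropy-maximization step mentioned above can be spelled out with Lagrange multipliers; the following is a standard textbook sketch (the multiplier names \(\alpha\), \(\beta\) are conventional, not taken from this excerpt):

```latex
% Maximize S = -\sum_i p_i \ln p_i subject to normalization and a
% constraint on the expected energy, using Lagrange multipliers:
\[
  \frac{\partial}{\partial p_i}\Big[ -\sum_j p_j \ln p_j
    - \alpha \Big( \sum_j p_j - 1 \Big)
    - \beta \Big( \sum_j p_j E_j - \langle E \rangle \Big) \Big]
  = -\ln p_i - 1 - \alpha - \beta E_i = 0,
\]
% so p_i \propto e^{-\beta E_i}; identifying \beta = 1/T gives
\[
  p_i = \frac{e^{-E_i/T}}{Z}, \qquad Z = \sum_i e^{-E_i/T}.
\]
```

The quantum analogue replaces probabilities by complex amplitudes and the maximum by a stationary point, which is why the extremization must be read as a stationarity condition rather than a maximization.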

Statics
Dynamics
Quantropy
Computing Quantropy
The Quantropy of a Free Particle
Conclusions
