Abstract

In 1927, Earle Hesse Kennard derived an inequality expressing Heisenberg’s uncertainty principle. Since then, the standard deviation has traditionally been used as the measure of uncertainty in quantum mechanics. Jan Hilgevoord, however, asserts that the standard deviation is neither a natural nor a generally adequate measure of quantum uncertainty. Specifically, he asserts that standard deviations are inadequate as the quantum uncertainties in the single- and double-slit diffraction experiments. He even claims that these examples make it clear that the standard deviation is the wrong concept with which to express the uncertainty principle in general, and that the Kennard relation has little to do with the uncertainty principle. We investigate which quantities are adequate as measures of quantum uncertainty and, beyond that, examine the effects of multiplying the two uncertainties, namely the characteristics hidden deep inside the Kennard inequality. Through these investigations it will become clear that his assertions are wrong. All of our discussions will help deepen the understanding of the Heisenberg uncertainty principle, and they afford an opportunity to reflect on the essence of the Fourier transform. The aim of this paper is to draw conclusions about whether the Kennard inequality is justified.
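As a numerical illustration of the Kennard relation discussed above (not part of the original abstract), the following sketch checks σ_x · σ_p ≥ ℏ/2 for a Gaussian wave packet, which is known to saturate the bound. Units with ℏ = 1 and the grid parameters are arbitrary choices made for this example.

```python
import numpy as np

# Numerical check of the Kennard relation sigma_x * sigma_p >= hbar/2
# for a Gaussian wave packet (which saturates the bound).
hbar = 1.0
sigma = 0.7                      # position spread of the packet (arbitrary)
N, L = 4096, 40.0                # grid points and box length (arbitrary)
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

# Normalized Gaussian wave function psi(x); |psi|^2 has standard deviation sigma
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

# Momentum-space amplitude via FFT (phase factors drop out of |phi|^2)
p = hbar * 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
phi = np.fft.fftshift(np.fft.fft(psi)) * dx
dp = p[1] - p[0]

rho_x = np.abs(psi) ** 2
rho_p = np.abs(phi) ** 2
rho_p /= rho_p.sum() * dp        # normalize the momentum density numerically

def std_dev(grid, rho, d):
    """Standard deviation of a probability density sampled on a uniform grid."""
    mean = np.sum(grid * rho) * d
    return np.sqrt(np.sum((grid - mean) ** 2 * rho) * d)

sx = std_dev(x, rho_x, dx)
sp = std_dev(p, rho_p, dp)
print(sx, sp, sx * sp)           # the product comes out ~0.5 = hbar/2
```

The Fourier-transform step is the essential ingredient: the position and momentum densities are not independent, which is exactly why their spreads cannot both be made small.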

Highlights

  • Heisenberg’s uncertainty principle [1] is one of the most famous foundations of quantum mechanics

  • In light of this history, several questions arise: is there a reason why we use the standard deviation as the measure of uncertainty in quantum mechanics? May other quantities be used instead? And if so, what are the merits and demerits of each measure?

  • Measures of Spread in Mathematical Statistics: the author makes clear that everything stated in that section is a fact of ordinary mathematical statistics, independent of quantum mechanics
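The distinction this highlight refers to, between the standard deviation and the (mean) absolute deviation as measures of spread, can be sketched in plain statistics with no quantum mechanics involved. A minimal illustration (the sample distribution, size, and seed are arbitrary choices for this example):

```python
import numpy as np

# Compare two classical measures of spread on the same sample:
# the standard deviation and the mean absolute deviation (MAD).
rng = np.random.default_rng(seed=0)
sample = rng.normal(loc=0.0, scale=1.0, size=1_000_000)

std = sample.std()
mad = np.abs(sample - sample.mean()).mean()

# For a normal distribution the two are proportional:
# MAD = sigma * sqrt(2/pi) ~= 0.798 * sigma.
print(std, mad, mad / std)
```

For heavy-tailed distributions the two measures behave very differently: the standard deviation weights outliers quadratically, and for sufficiently slowly decaying tails it can diverge while the absolute deviation remains finite. This is the kind of distinction that matters when choosing a measure of spread.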

Summary

Introduction

Heisenberg’s uncertainty principle [1] is one of the most famous foundations of quantum mechanics. Hilgevoord asserts that using the standard deviation in experiments in which electrons are incident on a slit is unjustified [13]. In light of this history, several questions arise: is there a reason why we use the standard deviation as the measure of uncertainty in quantum mechanics? May other quantities be used instead? And if so, what are the merits and demerits of each measure? Answering these questions will let us better understand the essence of the Heisenberg uncertainty principle.

Measures of Spread in Mathematical Statistics
Absolute Deviation as a Measure of Quantum Uncertainty
Standard Deviation as a Measure of Quantum Uncertainty
Confirmation of the Ideality of Absolute Deviation
Infimums in Uncertainty Relations
Conclusions
