Abstract

Quantum theory (QT) and the new stochastic approaches make no deterministic prediction for a single measurement or for a single time series of events observed on a trapped ion, an electron, or any other individual physical system. The predictions of QT, being probabilistic in character, apply to the statistical distributions of results obtained in repeated experiments. The Copenhagen interpretation (CI) of QT acknowledged the abstract and statistical character of these predictions, but at the same time claimed that a state vector Ψ provides a complete description of each individual physical system. Einstein, Podolsky, and Rosen showed that assigning the state vector to an individual physical system, together with the postulate of its instantaneous reduction upon measurement, leads to the so‐called EPR paradox in experiments with entangled pairs of particles. EPR concluded that a state vector could not provide a complete description of individual systems, and the question arose whether the probabilistic predictions of QT could be derived, by introducing supplementary parameters, from some more fundamental spatio‐temporal deterministic description of the invisible sub‐phenomena. The experimental violation of the Bell inequalities in the spin polarization correlation experiments (SPCE), which implemented the Bohm and EPR gedanken experiments, eliminated the so‐called local and realistic models of the sub‐phenomena. Quite often the violation of these inequalities has been incorrectly interpreted as a proof of the completeness of QT, or as a violation of locality and causality in the micro‐world. The local and realistic models overlooked the fact that an experimental outcome is only information about a particular system‐system or system‐instrument interaction; quantum phenomena are therefore described in terms of probabilities.
It is well known that a probability distribution is not an attribute of a die but a characteristic of a whole random experiment: “rolling a die”. In this paper we underline that quantum probabilities describe not a lack of knowledge about the physical systems but a lack of knowledge about the experiments, and are therefore “contextual”, and that statistical long‐range correlations between two random variables X and Y are not a proof of any causal relation between these variables. Moreover, any probabilistic model used to describe a random experiment is consistent only with a specific protocol telling how the random experiment has to be performed. The probabilistic model used to prove the Bell inequalities implies a protocol that is completely inappropriate for, and impossible to implement in, SPCE. We conclude that the meaningful question about the predictable completeness of QT is still open and should be answered by a careful and unconventional analysis of the experimental data. A correct understanding of the statistical and contextual character of QT is essential for research in the domain of quantum information and quantum computing.
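The claim that strong correlations between X and Y do not prove a causal relation between them can be illustrated with a minimal numerical sketch (an illustration of the general statistical point only, not a model from the paper): two variables that never influence each other, but are both driven by a hidden common cause, end up almost perfectly correlated.

```python
import random

random.seed(0)

# Hypothetical setup: Z is a hidden common cause; X and Y each depend
# only on Z plus small independent noise, and never on each other.
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + 0.1 * random.gauss(0, 1) for zi in z]
y = [zi + 0.1 * random.gauss(0, 1) for zi in z]

def pearson(a, b):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

r = pearson(x, y)
print(f"correlation between X and Y: {r:.2f}")  # close to 1, yet X and Y never interact
```

The correlation coefficient comes out close to 1 even though no causal link between X and Y exists; the dependence is entirely inherited from the common cause Z.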
