Abstract

Realist, no-collapse interpretations of quantum mechanics, such as Everett's, face the probability problem: how to justify the norm-squared (Born) rule from the wavefunction alone. By the Gleason-Busch theorem, any basis-independent measure can only be the norm-squared measure, yet various popular, non-wavefunction-based phenomenological measures, such as observer, outcome, or world counting, are frequently demanded of Everettians. These alternatives conflict with the wavefunction realism upon which Everett's approach rests, which seems to call for an objective, basis-independent measure based only on wavefunction amplitudes. The ability of quantum probabilities to interfere destructively with one another, however, makes it difficult to see how probabilities can be derived solely from amplitudes in an intuitively appealing way. I argue that algorithmic probability can solve this problem, since the objective, single-case probability measure that wavefunction realism demands is exactly what algorithmic information theory was designed to provide. The result is an intuitive account of complex-valued amplitudes as coefficients in an optimal lossy data compression, such that changes in algorithmic information content (entropy deltas) are associated with phenomenal transitions.
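
For readers unfamiliar with the two notions the abstract connects, they can be stated compactly. The following is standard background notation only (the symbols $c_i$, $U$, and $m$ are conventional choices, not drawn from the paper), and it does not reproduce the paper's own construction linking the two. The Born rule assigns each measurement outcome the squared modulus of its amplitude, while Solomonoff's algorithmic probability weights an output by the halting programs that produce it on a universal prefix machine $U$:

\[
|\psi\rangle = \sum_i c_i\,|i\rangle \;\Longrightarrow\; p(i) = |c_i|^2,
\qquad
m(x) = \sum_{p\,:\,U(p)=x} 2^{-|p|}.
\]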
