
# Comments on "100 Years of Quantum Mysteries"

**To the February 2001 issue of Scientific American, Max Tegmark and John Archibald Wheeler contributed an article titled "100 Years of Quantum Mysteries", giving an overview of the foundations of quantum mechanics. (This article is also available on the arXiv as "100 Years of the Quantum".) Roderich Tumulka provided the following comments, which point out several flaws in their article.**

> What was this quantity, the "wave function", which Schrödinger's equation described? This central puzzle of quantum mechanics remains a potent and controversial issue to this day.

It is remarkable that Wheeler and Tegmark see the "central puzzle" not in the inconsistency revealed by the measurement problem, or in the fact that orthodox quantum mechanics does not give any understandable description of what actually happens, or that it is rather unclear what quantum mechanics finally *says*. Instead, they see the main problem of QM in the question "What is the wave function?".

What sort of answer do they expect? The wave function made of semolina pudding? Why isn't the answer simply "The wave function is a new object unknown to classical physics, but is one of the objects in a quantum world just like the Lorentzian metric is a part of a general relativistic world"?

It is hard to understand what this question, "What is the wave function?", means at all. And it is remarkable that it is so hard to understand what the problem even is with what Wheeler and Tegmark take to be the main problem of QM.

> According to quantum physics, a card perfectly balanced on its edge will by symmetry fall down in both directions at once

Is this true? This is the first time I have seen such an experiment presented as a paradigm of a quantum experiment. Is it really true that one can prepare a card in such a way that the wave function amplitude for falling face down is comparable to the amplitude for falling face up? Note that if you prepare the wave function in a way that is not completely symmetric, the total wave function may "fall down to the right", just as in classical mechanics. This experiment is particularly ill-suited as an example of a quantum experiment, since one cannot test whether there ever really was a superposition: in contrast, a two-way experiment can be set up in which one can choose between detecting which way the particle went and observing an interference pattern that proves there really is a superposition.

> If our observer repeats the experiment with four cards, there will be 2x2x2x2=16 outcomes (see figure). In almost all of these cases, it will appear to her that queens occur randomly, with about 50% probability.

This attempt to derive probabilities from many branches of the wave function is incorrect. Think of an experiment where the probability predicted by QM is not .5 but .3, and repeat it 1000 times. Then there are 2^{1000} branches of the wave function, one corresponding to every sequence of results. Most branches have an "up" frequency close to .5, not to .3 as predicted. So why do we always end up in the comparatively few branches where the frequency is close to .3?
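The counting can be made explicit with a short Python sketch (an illustration of the argument; the function name `branch_fraction` is mine, not from the comment). Counting every branch equally, as the many-branches argument implicitly does, almost all branches show frequency near .5, and virtually none show the Born value .3:

```python
from math import comb

def branch_fraction(n, target, tol=0.05):
    """Fraction of the 2**n branches, each counted equally, whose
    relative frequency of 'up' results lies within tol of target."""
    hits = sum(comb(n, k) for k in range(n + 1)
               if abs(k / n - target) <= tol)
    return hits / 2**n

n = 1000
near_half = branch_fraction(n, 0.5)  # close to 1: almost every branch
near_born = branch_fraction(n, 0.3)  # astronomically small
```

So under equal counting, the branches that agree with the quantum prediction of .3 form a vanishing minority.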

> According to a 1909 theorem by the French mathematician Borel, she will observe queens 50% of the time in almost all cases (in all cases except for what mathematicians call a set of measure zero) in the limit where she repeats the card experiment infinitely many times.

If the theorem referred to is the strong law of large numbers, then I must note that this theorem *presupposes* that the probabilities after finitely many experiments are those of a binomial distribution. In our case, however, there are no probabilities at all for a finite repetition of the experiment: the many-worlds view does not lead to *any* probabilities, as there exist parallel copies of the observers who perceive different statistical laws. Some observe a frequency of .5, others a frequency of .3. How can one say that one of them is more legitimate than the other?
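The presupposition can be made concrete (a sketch of my own; the function name `born_mass` is just for illustration). The strong law delivers frequency .3 only if each finite sequence of outcomes is already weighted by its binomial (Born-rule) probability, which is exactly what is in question:

```python
from math import comb

def born_mass(n, p, target, tol=0.05):
    """Total binomial (Born-rule) weight of outcome sequences whose
    relative frequency of 'up' lies within tol of target."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1) if abs(k / n - target) <= tol)

n, p = 1000, 0.3
typical = born_mass(n, p, 0.3)   # close to 1 under Born weighting
atypical = born_mass(n, p, 0.5)  # nearly zero under Born weighting
```

With Born weights the .3 branches are typical; with equal branch counting they are atypical. The theorem cannot decide between the two weightings; it takes one as input.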

> Max Born had the dramatic insight that the wave function should be interpreted in terms of probabilities.

Obviously the wave function is more than mere probabilities, so it cannot be reduced to probabilities.

> If we measure the location of an electron, the probability of finding it in a given region depends on the intensity of its wave function there. [...] Einstein was deeply unhappy with this interpretation...

The first statement is certainly true. And certainly not something Einstein would have objected to.

> [Einstein] expressed his preference for a deterministic Universe [...] Schrödinger was also uneasy.

Wheeler and Tegmark give the impression that the primary (or perhaps only) reason to feel uneasy about orthodox QM is its lack of determinism. This impression is ridiculous: GRW is an indeterministic theory, and it is acceptable because it is consistent, which orthodox QM is not.

> Although [the collapse rule] provided a strikingly successful calculational recipe, there was a lingering feeling that there ought to be some equation describing when and how this collapse occurred.

Wheeler and Tegmark seem unaware of the fact that if one treats the measurement apparatus or observer (which itself is made out of electrons and nuclei) with the Schrödinger equation, one meets a *logical* problem: the measurement problem. This problem is not just a matter of taste.

> "many minds"

Wheeler and Tegmark are unaware that "many minds" is a technical term that is not synonymous with "many worlds" but denotes a rather different proposal, put forward by Albert and Loewer in 1988.

> Could the apparent quantum randomness be replaced by some kind of unknown quantity carried about inside particles, so-called "hidden variables"?

The hidden variables that Einstein, Podolsky and Rosen advocated in their famous 1935 paper were position and momentum. The best-known hidden-variables theories today, those of Bohm and of Nelson, have particle positions as hidden variables. So how can Wheeler and Tegmark suggest that hidden variables are typically "carried about inside particles"?

> CERN theorist John Bell showed that in this case, quantities that could be measured [...] would inevitably disagree with standard quantum predictions. After many years, technology allowed researchers to conduct these experiments and eliminate hidden variables as a possibility.

Bell's argument does not exclude hidden-variables theories but local theories, i.e., theories that contain no superluminal influences. Note that both Bohm's and Nelson's theories make the correct prediction (the one confirmed by experiment).

> Anton Zeilinger's group in Vienna has even started discussing doing [a double slit experiment] with a virus. If we imagine, as a Gedanken experiment, that this virus has some primitive kind of consciousness, then the many worlds/many minds interpretation seems unavoidable, as has been emphasized by Dieter Zeh.

Apart from the nonstandard usage of the term "Gedanken experiment": why should this experiment be excluded by Bohmian mechanics, or stochastic mechanics, or GRW?

> As David Deutsch has emphasized, it will be hard to deny the reality of all these parallel states [i.e., superpositions] if such [i.e., quantum] computers are actually built.

Neither Bohmian mechanics, nor stochastic mechanics, nor GRW denies the existence of superpositions: Wave functions play a crucial role in these theories and wave functions can of course be added.

> Our fallen quantum card [...] is constantly bumped by snooping air molecules, photons, etc., which thereby find out whether it has fallen to the left or to the right, destroying ("decohering") the superposition and making it unobservable.

Indeed, decoherence makes the superposition unobservable, but it does not destroy the superposition. This is an immediate and obvious consequence of the linearity of the time evolution of the wave function.

> A density matrix having the form (a 0 \\ 0 b) would represent a familiar classical situation

This is not necessarily so.

1) The density matrix assigned to an ensemble of wave functions, described by a measure M on the unit sphere in Hilbert space, is the integral of |psi><psi| M(dpsi). This assignment is many-to-one, i.e. many different measures (probability distributions) on the set of possible states (the unit sphere in Hilbert space) have the same density matrix. It is not justified to say that the above density matrix *represents* the ensemble with probability 1/2 for |1> and probability 1/2 for |2> and probability 0 for all other states like (1/sqrt(2)) (|1> + |2>): just as well one might say this density matrix represents the measure that gives probability 1/2 to each of (1/sqrt(2)) (|1> + |2>) and (1/sqrt(2)) (|1> - |2>), and probability zero to each of |1> and |2>. Or the uniform measure on the unit sphere in 2-dimensional Hilbert space. The word "represents" suggests uniqueness where it is not present.

2) Density matrices do not always represent statistical ensembles of wave functions. They also occur as partial traces of other density matrices that refer to more particles. E.g., if a two-particle system has the state vector psi = sqrt(a) |1>|up> + sqrt(b) |2>|down>, then we may write the density matrix as |psi><psi| and take the partial trace over the second factor: this yields the matrix (a 0 \\ 0 b). But this does *not* mean the state vector of particle 1 is either |1> or |2>, with probabilities a and b. It means particle 1 does not possess a state vector of its own, because it is entangled with particle 2, and the common state vector of particle 1 and particle 2 is psi with probability 1.
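Both points can be checked numerically. Here is a small Python/NumPy sketch (my illustration; the helper name `ensemble_rho` is not from the comment) showing that two different ensembles, and also the partial trace of an entangled pure state, all yield the very same matrix diag(1/2, 1/2), even though they describe three different physical situations:

```python
import numpy as np

e1 = np.array([1.0, 0.0])          # |1>
e2 = np.array([0.0, 1.0])          # |2>
plus = (e1 + e2) / np.sqrt(2)      # (|1> + |2>)/sqrt(2)
minus = (e1 - e2) / np.sqrt(2)     # (|1> - |2>)/sqrt(2)

def ensemble_rho(states, probs):
    """Density matrix sum_i p_i |psi_i><psi_i| of a statistical ensemble."""
    return sum(p * np.outer(s, s.conj()) for p, s in zip(probs, states))

# 1) Two different ensembles, the same density matrix (here a = b = 1/2):
rho_a = ensemble_rho([e1, e2], [0.5, 0.5])
rho_b = ensemble_rho([plus, minus], [0.5, 0.5])

# 2) The same matrix as a partial trace of the entangled pure state
#    psi = sqrt(1/2) |1>|up> + sqrt(1/2) |2>|down>:
psi = np.zeros(4)
psi[0], psi[3] = np.sqrt(0.5), np.sqrt(0.5)   # |1,up> and |2,down> components
rho_pure = np.outer(psi, psi)                  # |psi><psi| of the pair
rho_1 = np.trace(rho_pure.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# rho_a, rho_b and rho_1 all coincide with diag(1/2, 1/2).
```

The assignment state-of-affairs → density matrix is many-to-one, so reading off a unique "classical situation" from diag(a, b) is not justified.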

> it was embarrassing that nobody had managed to provide a testable deterministic equation specifying precisely when this mysterious collapse was supposed to occur.

It is true that nobody has managed to provide a testable deterministic equation specifying precisely when the collapse is supposed to occur. But why is it necessary that this equation be deterministic? GRW have provided a testable stochastic equation specifying precisely when the collapse is supposed to occur. But Wheeler and Tegmark do not seem to know GRW.

> Bohm (an ontological interpretation where an auxiliary "pilot-wave" allows particles to have well-defined positions and velocities)

It is hard to think of a more stupid way to describe Bohmian mechanics. The "auxiliary" wave is nothing but the wave function of quantum mechanics. What could be meant by saying the wave function *allows* the particles to have well-defined positions and velocities is completely obscure. This phrase simply makes no sense at all. Bohm's theory says the particles have well-defined positions and velocities. That's a reasonable statement to make. And a true one.

> The reader is warned of rampant linguistic confusion in this area.

I expect now comes something about the word "measurement" and its inappropriateness. (It doesn't.)

> ... could save astute students many hours of frustrated confusion.

That's a nice aim.
