*Principia* in 1687, and Newtonian mechanics proved more than adequate for well over two centuries. Newton had also speculated that light consisted of a stream of particles, but in 1801 Thomas Young demonstrated that light exhibits interference when passing through two narrow, closely spaced slits, showing that light has a wave-like nature in contrast to the particle-like nature of Newton’s conjecture. In 1814 Pierre-Simon Laplace proposed that a sufficiently powerful intelligence could use Newtonian mechanics to predict all future events from present conditions - a view known as *determinism* - for which there would be no serious challenge for more than a century.

In 1900 Lord Kelvin (William Thomson) identified two outstanding “cloudlets” on the horizon of physics, famously declaring the science of physics to be almost complete. The first of these “cloudlets” was the failure of the Michelson-Morley experiment to find the expected evidence of the “luminiferous aether”. The second was the failure of physical theory to correctly predict the profile of the spectrum of light emitted by glowing hot bodies (so-called “black body radiation”) - a problem that later came to be known as the *ultraviolet catastrophe*. These two “cloudlets” turned out to be in need of far more than just a little “tidying-up”.

In 1905 Albert Einstein published his
paper outlining what is now known as the Special Theory of Relativity. This is
a theory that subsumes Newtonian mechanics under a new paradigm that extends to
velocities approaching that of light, where Newtonian mechanics shows its
limitations. With an inspirational change of paradigm concerning the way that
velocities add together, Einstein showed that the speed of light is independent
of reference-frame (accounting for the first of the two problems identified by
Kelvin). A significant consequence of the Special Theory of Relativity is that
information cannot be transferred faster than light, and this leads to the
conclusion that no two events can be considered simultaneous in any *absolute* sense of the word - rather, any such simultaneity is dependent upon reference frame, or is *relative*. Hermann Minkowski subsequently contributed to Special Relativity by showing how space and time must be unified in a manner in which they are no longer mutually independent variables, but instead comprise a four-dimensional “space-time” model of the universe.
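The paradigm change in how velocities add together can be made concrete. A minimal sketch, using the standard relativistic velocity-addition law (the formula itself is not given in the post):

```python
# Relativistic velocity addition: velocities u and v combine not as u + v
# but as
#   w = (u + v) / (1 + u*v / c**2)
# so that no combination of sub-light velocities ever exceeds c.

C = 299_792_458.0  # speed of light in m/s

def add_velocities(u: float, v: float) -> float:
    """Combine two collinear velocities relativistically."""
    return (u + v) / (1.0 + u * v / C**2)

# A projectile fired at 0.9c from a ship itself moving at 0.9c does NOT
# move at 1.8c relative to a stationary observer:
print(add_velocities(0.9 * C, 0.9 * C) / C)  # ≈ 0.9945 - still below c

# Light emitted from a moving source still travels at exactly c,
# independent of reference frame - the fact Einstein's theory accounts for:
print(add_velocities(0.5 * C, C) == C)  # True
```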
In 1901 Max Planck made an attempt to
explain the profile of the black body emission spectrum (the second of the two
problems identified by Kelvin). In a leap of imagination he proposed that
radiation is emitted and absorbed in discrete amounts that he called “quanta”,
and this seemed to give the correct profile for the emission spectrum. In 1905
Einstein produced a paper on the photoelectric effect in which (despite the
fact that Young had shown light to have a wave-like nature) he had to conclude
that light was acting as a stream of particles (later to be called *photons*). So instead of quantization being a characteristic of the process of emission and absorption, as Planck had assumed, Einstein had shown it to be a characteristic of the radiation itself (a conclusion that was resisted until Compton’s scattering experiments of 1923 clinched it). An intriguing aspect was that the description of light as wave or particle was determined by the experimenter’s choice of experimental apparatus: Young’s slits showed light to have wave-like properties, and Einstein’s photoelectric effect showed light to have particle-like properties.
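Einstein’s reasoning can be illustrated numerically. A minimal sketch using Planck’s relation E = hf and Einstein’s photoelectric equation; the work function and frequencies below are illustrative assumptions, not values from the post:

```python
# Each quantum ("photon") carries energy E = h*f, and an electron is
# ejected from a metal only if h*f exceeds the metal's work function phi:
#   KE_max = h*f - phi
# This is why brightness (intensity) cannot compensate for frequency -
# the particle-like behaviour Einstein identified.

H = 6.62607015e-34    # Planck's constant, J*s
EV = 1.602176634e-19  # one electronvolt in joules

def photon_energy_ev(frequency_hz: float) -> float:
    """Energy of a single quantum of light, in electronvolts."""
    return H * frequency_hz / EV

def max_kinetic_energy_ev(frequency_hz: float, work_function_ev: float) -> float:
    """Max kinetic energy of an ejected photoelectron (negative => no emission)."""
    return photon_energy_ev(frequency_hz) - work_function_ev

# Illustrative metal with a 2.3 eV work function: red light (~4.3e14 Hz)
# ejects no electrons however intense, but violet light (~7.5e14 Hz) does.
print(max_kinetic_energy_ev(4.3e14, 2.3))  # negative: no emission
print(max_kinetic_energy_ev(7.5e14, 2.3))  # positive: electrons ejected
```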
The idea of making a measurement upon a
classical (i.e. non-quantum) system poses no great conceptual difficulties - we
select the attribute to be measured and conduct the measurement - but things
are not so simple in the case of quantum systems. It appeared that, whilst
between measurements, quantum systems can have attributes that would be
mutually incompatible from a classical perspective, analogous to a classical object
being (for example) in two places at the same time. This “superposition of
states” that quantum systems seem to adopt between measurements imposes
constraints upon what can be expected when the next measurement takes place,
but not in a manner that makes any sense in classical terms. A new theory was
required that subsumes Newtonian mechanics under a new paradigm that extends to
the very small, where Newtonian mechanics shows its limitations. This was
“quantum mechanics”.

In 1925/26 Erwin Schrodinger presented his
famous “wave equation”, which mathematically models how a quantum system evolves *continuously* and *deterministically* whilst between measurements, but as a *superposition of states*. The problem remained that whenever a measurement is made upon a quantum system it is always found to be in one and only one of the states that appear in superposition in the Schrodinger equation, and by virtue of the act of measurement it is said to have undergone “state vector reduction”. It seemed, then, that the act of measurement brings about a *discontinuous* and *non-deterministic* transition from the superposition described by the wave equation to the actual measured value.
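The wave equation itself is not written out above; in its standard modern time-dependent form it reads:

```latex
i\hbar \frac{\partial}{\partial t}\, \Psi(\mathbf{r}, t) \;=\; \hat{H}\, \Psi(\mathbf{r}, t)
```

where \(\Psi\) is the wave function, \(\hbar\) the reduced Planck constant, and \(\hat{H}\) the Hamiltonian (total energy) operator. Because the equation is linear, any weighted sum of solutions is again a solution - which is exactly what permits the superposition of states.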
In 1926 Max Born postulated that the
Schrodinger wave equation should be interpreted in terms of *probability*. That is to say, the variable described by the wave equation (given in superposition whilst between measurements) should be interpreted, for each of the states in that superposition, as the *probability* that that particular state will become the *actual* state of the system if a measurement were to be made at that time. Born’s postulate admits of two interpretations. The first is the classical interpretation: that we are constrained to work in terms of probabilities because our knowledge of the system is incomplete. The second is the non-classical interpretation: that quantum systems differ from classical systems because the probability associated with them is *irreducible* and *ineliminable*. The latter interpretation was anathema to some physicists, including Einstein, who held that the conceptual objects with which science deals continue to exist and to have well-defined attributes even when not being measured (a form of *metaphysical realism*). Einstein famously claimed in a letter to Born that “the Old One does not play dice”.
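Born’s postulate can be sketched in a few lines of code. This is an illustrative toy model - the hypothetical two-state system and all numbers are assumptions, not anything from the post:

```python
# The Born rule: each state in a superposition carries a complex
# "amplitude"; the probability that a measurement finds that state is the
# squared magnitude of its amplitude, and these probabilities sum to 1.
import random

random.seed(0)  # reproducible simulated measurements

# A hypothetical superposition of "undecayed" and "decayed",
# normalised so that |a|^2 + |b|^2 = 1:
amplitudes = {"undecayed": 0.6 + 0.0j, "decayed": 0.0 + 0.8j}

probabilities = {state: abs(a) ** 2 for state, a in amplitudes.items()}
print(probabilities)  # undecayed ≈ 0.36, decayed ≈ 0.64

def measure() -> str:
    """One measurement: the superposition yields exactly one actual state."""
    return random.choices(list(probabilities), weights=probabilities.values())[0]

# A single measurement always gives a single state; only the long-run
# frequencies over many identically prepared systems reveal the
# Born-rule probabilities.
counts = {s: 0 for s in probabilities}
for _ in range(100_000):
    counts[measure()] += 1
print(counts["decayed"] / 100_000)  # ≈ 0.64
```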
The new “quantum mechanics” was fiercely
debated at the fifth Solvay Conference in Brussels in 1927, where Niels Bohr
rejected Einstein’s realism on the grounds that scientific theories are nothing
more than “instruments” that allow us to make more accurate predictions about
the world, and tell us nothing about any putative underlying “reality” (a view
known as *instrumentalism*). Furthermore, if different kinds of apparatus give us seemingly incompatible descriptions of a quantum system (e.g. light as both a wave and a particle), then these incompatible descriptions must be considered *complementary*, and no further interpretation should be placed upon them.
So, consistent with Laplacian determinism,
for all practical purposes a *classical* system may be conceived as completely described by all of its attributes at all times, and any uncertainty in that description would be a consequence of our inability to make sufficiently precise measurements upon one or more of those attributes. However, Werner Heisenberg (initially siding with Bohr on the *instrumentalist* side of the debate) claimed that it is meaningless to regard *quantum* systems as having any kind of “reality” at all between interactions. He presented a mathematical argument showing that quantum systems differ significantly from classical systems in that they entail an *inherent* uncertainty or indeterminism. It turns out that precise measurements upon certain pairs of attributes of a quantum system - e.g. position and momentum - cannot be made simultaneously. A precise measurement of position will leave the momentum indeterminate, and vice versa. This is the “Heisenberg Uncertainty Principle”, which gives a formal footing to the strange aspect of quantum mechanics mentioned above: how the system behaves depends upon what the physicist chooses to measure.
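The Uncertainty Principle is stated qualitatively above; its standard quantitative form for position and momentum is:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\hbar\) is the reduced Planck constant: the smaller the spread \(\Delta x\) left by a position measurement, the larger the unavoidable spread \(\Delta p\) in momentum, and vice versa.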
The interpretation placed on quantum mechanics by Bohr, Heisenberg, and other physicists who similarly rejected Einstein’s realism eventually became known as the “Copenhagen Interpretation”, after the city that was home to Bohr’s institute. In the Copenhagen Interpretation, state
reduction became known as the “collapse of the wave function”, and this
“collapse” raised the serious question of how to reconcile these two aspects in
the evolution of quantum systems - continuous and deterministic whilst between
measurements, but discontinuous and probabilistic upon measurements. This is
known as the “measurement problem”.

In 1935 Schrodinger (siding with Einstein
on the realist side of the debate) proposed a thought-experiment in which he
established a link between the quantum world and the classical world in order
to demonstrate what he considered to be the absurdity of the Copenhagen
Interpretation. Schrodinger envisaged a cat hidden from view in a box in which
there is also a vial of poison. The vial is broken upon detection of the decay
of a radioactive atom, whereupon the cat would die. The quantum state of the
radioactive atom would be in a superposition of two states, both decayed and undecayed,
until a measurement took place. Schrodinger’s claim was that, according to the
Copenhagen Interpretation, the cat (a classical object) would also be in a
superposition of two (classically incompatible) states - both alive and dead -
until a measurement took place, and that this was a preposterous suggestion.
Schrodinger’s thought-experiment emphasised the need for a more rigorous
consideration of the idea of a “measurement” in quantum mechanics.

In 1935 Einstein produced a paper, along
with his co-workers Boris Podolsky and Nathan Rosen, in which he used an
elegant thought-experiment in an attempt to endorse his realist interpretation
of quantum mechanics. That thought-experiment is now known as the “EPR paradox”
(from the names Einstein, Podolsky, and Rosen). The EPR scenario begins with
two quantum systems that are prepared such that they exhibit a correlation in
respect of a common attribute (the most frequently used example of such an
attribute is quantum mechanical spin, though this is not the one used in the
original paper). The two quantum systems are then allowed to separate to a
distance at which the normal modes of information transfer would introduce a significant
delay. When the attribute of interest is measured at one of the systems, the
correlated attribute at the other (as-yet unmeasured) system would also *immediately* be determined (because of the correlation). Now, according to classical theory, any correlation between two events is explained in one of two ways - either one event is the (direct or indirect) *cause* of the other, or both events are the *effects* of a common cause. According to the EPR paper the *immediate* nature of the reduction would eliminate the first of these explanations, since this would violate Einstein’s postulate (from Special Relativity) that information cannot be transferred faster than light. This leaves only the “common cause” explanation, and since there were no known variables that could point back to such a common cause, Einstein argued that quantum mechanics was an *incomplete* account awaiting the discovery of new variables that would complete it. This became known as the “hidden variables” hypothesis. However, in 1964 John Bell produced a mathematical theorem eliminating the common cause scenario, and this was experimentally confirmed by Alain Aspect in 1982. This seemed to leave the first explanation as the only option, but if it was to be given any credence then faster-than-light influences had to be entertained. In physics parlance any kind of influence that is mediated at or below the speed of light is known as a *local* influence, and so Bell’s theorem and the Aspect experiment, by eliminating *local* hidden variables, showed that *nonlocal* (i.e. faster-than-light) influences must be introduced into quantum mechanics. Indeed, realist interpretations of quantum mechanics have been proposed on the basis of nonlocal hidden variables, notably by Louis de Broglie and later, building on de Broglie’s work, by David Bohm, but this avenue remains problematic and is not widely accepted.
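The conflict Bell formalised can be shown numerically. A sketch using the CHSH form of Bell’s inequality (due to Clauser, Horne, Shimony, and Holt in 1969, rather than Bell’s original 1964 formulation): any local hidden-variable theory requires |S| ≤ 2, whereas the quantum mechanical prediction for singlet spin correlations, E(x, y) = −cos(x − y), exceeds that bound:

```python
# CHSH quantity for two measurement angles per side:
#   S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
# Local hidden variables require |S| <= 2; quantum mechanics, with the
# angle choices below, reaches 2*sqrt(2) ≈ 2.83.
import math

def E(x: float, y: float) -> float:
    """Quantum prediction for the spin correlation at detector angles x, y."""
    return -math.cos(x - y)

# The standard angle choices that maximise the violation:
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ≈ 2.828 > 2: no local hidden-variable account is possible
```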
In 1966 Cornelis Willem Rietdijk attempted
a rigorous proof of determinism that would be consistent with Minkowski’s
block-universe model of space-time (in contrast to the *indeterminism* that had been introduced by the Copenhagen Interpretation of quantum mechanics). This kind of deterministic model would implicate a nonlocal hidden-variables theory like that of Bohm. In 2001 Antoine Suarez set up an experiment that made use of the *relative* nature of simultaneity (as described by Special Relativity) to test for such nonlocal hidden variables. By suitable choice of reference frame in the case of the reduction of an entangled quantum system, the simultaneity of the otherwise *immediate* reduction can be turned into the sequence “A *before* B” (permitting A to be the nonlocal *cause* of B). But by suitable choice of a *different* reference frame, the same reduction can be turned into the sequence “B *before* A” (permitting B to be the nonlocal *cause* of A). Hence Suarez’ experiment is known as the “before-before” experiment. Suarez found that nonlocal correlations persist regardless of reference frame, eliminating the postulate of nonlocal hidden variables and so *disconfirming* Rietdijk’s proof of determinism. So Bell had eliminated *local* hidden variables and now Suarez had eliminated *nonlocal* hidden variables. This presents a problem for realist interpretations of quantum mechanics like that of Bohm. Suarez concluded that “nonlocal correlations have their roots outside of space-time”.
In 1932 John von Neumann, pursuing the
Copenhagen Interpretation, showed that there are no formal grounds for locating
wave function collapse at any point in the chain of events that begins with the
quantum system interaction, moves up through the experimental apparatus, through
the nervous system of the observer, and terminates in the observer’s
consciousness. This argument permits speculation regarding the extreme ends of
this chain, one such extreme leading to the suggestion that “consciousness
causes collapse”. The best known expression of this suggestion came in 1967
when Eugene Wigner proposed a variant of the “Schrodinger’s Cat” scenario known
as “Wigner’s Friend”. However, the idea that “consciousness causes collapse”
has many problems and is difficult to take seriously. The opposite extreme is
more enticing, this being the view that wave function collapse takes place upon
each and every interaction of the quantum system.

In 1957 Hugh Everett III proposed an
interpretation of quantum mechanics that was intended to eliminate the
troublesome idea of wave function collapse altogether. He used the idea that
any interaction between two systems results in an “entangled” system (as in the
EPR scenario), and extended this idea to include the measuring apparatus
itself. In this case, then, a measurement results in the quantum system under
measurement becoming entangled with the measuring apparatus. This way, the
state of the measured system is always *relative* to the state of the measuring equipment, and vice versa. Instead of a wave-function collapse that inexplicably favours a *single* actual outcome, each measurement generates *all* of the possible outcomes, with each outcome consisting of an entangled combination of measured system and measuring equipment. The entire ensemble of entangled systems would then exist in a superposition of states, with each individual entanglement continuing on *as if in a world of its own*. He called this interpretation the “relative state formulation”, but it is better known today as the “many worlds interpretation”. This scenario was not well received at the time of publication, but more recent variations of it have gained popularity amongst a subset of physicists today.
To recap, the Copenhagen Interpretation of
quantum mechanics entails a *single* world in which the superposition of states (given by the Schrodinger wave equation) “collapses” upon measurement to a single state (whatever we may mean by “measurement”) in accordance with irreducible probabilities. This *indeterminism* yields an “open future” - i.e. a future that does not yet exist - but it says nothing about the past. It would be consistent with the view that only the present exists - a view of time called *presentism* - and with the view that the past and the present exist but not the future - a view of time called *possibilism*. However, the demise of absolute simultaneity described by Special Relativity and the Minkowski model of space-time results in a block universe in which the past, present, and future “coexist” - a model that is static and devoid of any privileged “now”. This view of time (or, more precisely in this case, of space-time) is called *eternalism*. The demise of simultaneity entailed by Special Relativity provides strong grounds for pursuing the eternalist view, and this can be made consistent with quantum mechanics by rejecting the *single* world of the Copenhagen Interpretation in favour of the *many worlds* of the Everett “relative state” formulation. There is presently no means of testing whether there is a single world (with quantum systems being associated with irreducible probabilities) or, with each measurement on a quantum system, increasingly many worlds, and so subscribing to either one of these positions can only be a matter of prejudice.
In 1978 John Archibald Wheeler conceived a
thought-experiment (based on Young's two-slit experiment) that makes use of the
strange fact that the choice of experimental equipment determines whether light
exhibits wave-like properties or particle-like properties. The salient point
about the two-slit arrangement is that when the detector is of a kind that
ignores the paths by which the light reaches it, an interference pattern is
observed (indicating a wave-like nature consistent with Young’s version of the
experiment), but when the detector is of a kind that takes account of the path
(i.e. of which slit the light came through), no such pattern is observed
(indicating a particle-like nature consistent with Einstein’s work on the
photoelectric effect). Wheeler’s idea was to *delay* the choice of detector until the light had already passed through (one or both of) the slits. The scenario has since been experimentally tested, finding that the choice of detector *still* determines whether the light exhibits a wave-like or particle-like nature at the detector. Wheeler commented that “one decides the photon shall have come by one route *or* by both routes *after it has already done its travel*” (italics added for emphasis).
In 1954 Alan Turing pointed out a
consequence of the statistics describing the decay of unstable quantum systems
like radioactive atoms. The probability of finding upon measurement that the
state has changed depends on the time interval between measurements, with more
frequent measurements reducing the probability of state change. In the limit,
where the measurement becomes continuous, the probability of state change
reduces to zero - i.e. the state *never changes*. This effect was analysed and given the name “Quantum Zeno Effect” by Misra and Sudarshan in 1977, and has since been confirmed experimentally; it is often described using the adage that “a watched pot never boils”.
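The statistics Turing pointed out can be sketched with a simple model. This is an illustrative sketch assuming a two-level system whose survival probability between measurements goes as cos²(θ) - an assumption for the example, not a detail from the post:

```python
# Quantum Zeno effect: splitting a fixed evolution interval into N
# measurements gives survival probability (cos^2(theta/N))^N, which
# approaches 1 as N grows - frequent observation "freezes" the state.
import math

def survival_probability(theta: float, n_measurements: int) -> float:
    """Probability the state is still unchanged after n equally spaced measurements."""
    return math.cos(theta / n_measurements) ** (2 * n_measurements)

THETA = math.pi / 2  # left unobserved, the state would certainly have changed

print(survival_probability(THETA, 1))       # ≈ 0: one final measurement finds it changed
print(survival_probability(THETA, 10))      # ≈ 0.78
print(survival_probability(THETA, 100))     # ≈ 0.976
print(survival_probability(THETA, 10_000))  # ≈ 0.9998 - "a watched pot never boils"
```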