[“Quantum foundations” series]

Demons in physics? Well, historically they appear as fanciful ways to explore theories – skilled marvels invoked in an illuminating rather than malevolent role. Pure imagination, not imagineering, eh.

Physicist James Clerk Maxwell created a thought experiment in 1867. His idea involved a fantastical “finite being” able to sort molecules of a gas. Rarified acuity and dexterity, indeed! The topic was a hypothetical violation of the second law of thermodynamics (regarding entropy of isolated systems). His marvel became known as “Maxwell’s demon.”

Fact checker rating: No violation of the second law.

This thought experiment has provoked debate and theoretical work on the relation between thermodynamics and information theory, extending to the present day, with a number of scientists arguing that theoretical considerations rule out any practical device violating the second law in this way. – Wiki

But prior to Maxwell’s imaginative inquiry, French scholar Pierre-Simon Laplace had introduced his own expository entity in 1814 – Laplace’s demon. The topic was whether the laws of physics entail determinism – a deterministic universe, at least in principle.

According to determinism, if someone (the demon) knows the precise location and momentum of every atom in the universe [a consummate snapshot], their past AND future values for any given time … can be calculated from the laws of classical mechanics. – Wiki

Fact checker rating …

Although determinism remains a lively topic in college philosophy and theology, for decades I’ve assumed that statistical mechanics and chaos theory (aka the butterfly effect) put it to rest – particularly after reading more and more about the challenges of even the 3-body problem in classical physics. And indeterminacy in quantum physics settles the matter even more so.

According to chemical engineer Robert Ulanowicz, in his 1986 book Growth and Development,

Laplace’s demon met its end with early 19th century developments of the concepts of irreversibility, entropy, and the second law of thermodynamics. – Wiki

So, predicating determinism (as often contrasted with “free will”) on classical physics seemed like hype. An over-the-top argument. Steampunk hubris – **the hypothesis that all details could be known to infinite precision** – perfect knowledge. (The debate over reductionism is another matter.)

But the notion lives on. Physicist Sean Carroll invoked Laplace’s demon recently, for both classical and quantum physics. “Laplace’s demon was never supposed to be a practical thought experiment … **But in principle, Newtonian mechanics is deterministic.**” [1]

And for quantum physics in the sense that wave functions evolve deterministically – with the notion of a universal wave function (akin to Laplace’s “single formula”): “**we can know the wave function exactly.**” His position aligns with his advocacy of the many-worlds interpretation.

… many-worlds … is my personal favorite approach to quantum mechanics [Carroll: the simplest formulation of all the alternatives.], but it’s also the one for which it is most challenging to pinpoint how and why probability enters the game.

In many-worlds, we can know the wave function exactly, and it evolves deterministically. There is nothing unknown or unpredictable. Laplace’s demon could predict the entire future of the universe with perfect confidence.[1]
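A minimal sketch of what “evolves deterministically” means for a wave function, using a two-state toy (my example, not Carroll’s): the Schrödinger propagator is unitary, so a known state runs forward – and backward – exactly.

```python
# "We can know the wave function exactly": a two-state system under
# H = sigma_x (hbar = 1). Since sigma_x squared is the identity,
# U(t) = exp(-iHt) = cos(t) I - i sin(t) sigma_x, which is unitary:
# evolution is deterministic and exactly reversible, U(-t) undoing U(t).

import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def U(t):
    return np.cos(t) * I2 - 1j * np.sin(t) * sx

psi0 = np.array([1, 0], dtype=complex)  # start in |0>
psi_t = U(1.7) @ psi0                   # evolve forward
psi_back = U(-1.7) @ psi_t              # evolve backward

print(np.allclose(psi_back, psi0))      # True: deterministic, reversible
```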

So, as Wiki notes, what’s a fact checker to do, eh.

The interpretation of quantum mechanics is still very much open for debate and there are many who take opposing views (such as the Many Worlds Interpretation and the de Broglie–Bohm interpretation). – Wiki

Anyway, that’s why this New Scientist article (below) caught my attention. The intellect or “super-powerful calculating intelligence” of Laplace’s demon has morphed into a supercomputer. And the debate becomes a question of computing capacity and power.

There has recently been proposed a limit on the computational power of the universe, i.e. the ability of Laplace’s demon to process an infinite amount of information. The limit is based on the maximum entropy of the universe, the speed of light, and the minimum amount of time taken to move information across the Planck length, and the figure was shown to be about 10^120 bits. Accordingly, anything that requires more than this amount of data cannot be computed in the amount of time that has elapsed so far in the universe. – Wiki
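As a back-of-envelope check on that 10^120 figure, here’s a minimal sketch of one common heuristic – counting the Planck times elapsed since the Big Bang and squaring, in the spirit of Seth Lloyd’s “Computational capacity of the universe” (2002). The recipe and rounded constants are assumptions for illustration; variants of the calculation land within an order of magnitude or two of 10^120.

```python
# Rough capacity of a computer the size (and age) of the universe.
# Heuristic: (age of universe in Planck times) squared. Constants are
# rounded standard values; treat the result as order-of-magnitude only.

t_universe = 4.35e17   # age of the universe in seconds (~13.8 billion years)
t_planck = 5.39e-44    # Planck time in seconds

ticks = t_universe / t_planck   # elementary "ticks" so far, ~8e60
capacity = ticks ** 2           # ~6e121, within a couple of orders of 10^120

print(f"Planck times elapsed: {ticks:.1e}")
print(f"Rough capacity:       {capacity:.1e}")
```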

New Scientist > Physics > “Even a computer the size of the universe can’t predict everything” by Leah Crane (31 March 2020).

The article focuses on the issue of computational **precision**. Nature itself presents a limit, namely, the **Planck length**: “the most powerful computer that could ever exist can’t simulate below the Planck length.”

The three-body problem, which is the mathematical question of how three objects orbit one another according to Newton’s laws of motion, is notoriously difficult to solve because of a property called chaos. A chaotic system is one in which even a tiny change in the initial conditions of the objects, like their positions or speeds, has an enormous effect on how they move over time [both forward and backward].

Much of that difficulty comes from the fact that our computers have limited precision, so even tiny uncertainties can ruin a simulation of a chaotic system. But now Tjarda Boekholt at the University of Coimbra in Portugal and his colleagues say that even the best possible computer in the universe can’t solve this problem.

His team used extraordinarily precise simulations to probe whether a lack of precision is the only problem with predicting chaotic systems.

They started a simulation of three black holes orbiting each other at a distance of one parsec, or about 3 light years. They let it run for a while, and then tried to rewind it back to its initial configuration. They repeated this process 1212 times.

If it was impossible to rewind back to the initial configuration of the system, that would mean the system was unpredictable. The team used their results to calculate just how precise you would need to be in order to return to the initial configuration. They found that for about 5 per cent of triple systems, you would need to measure that configuration to a precision of less than a single Planck length – the smallest possible unit of measurement for length, and about 10^-51 times the initial distance between the black holes.

That means those systems are deeply unpredictable. “Even if you have a Planck length difference, which is a ridiculously small amount, some situations are still irreversible,” says Boekholt. “We can’t go more precise than nature.”

In a practical sense, this means that there is a limit to our predictive power when we try to examine the universe precisely, because even the most powerful computer that could ever exist can’t simulate below the Planck length.
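To make that “rewind test” concrete, here’s a minimal sketch – not the team’s method (they used the arbitrary-precision Brutus code), just a toy in ordinary 64-bit floats, with made-up masses and units assumed for illustration. A leapfrog integrator is exactly time-reversible on paper; in practice, round-off amplified by chaos keeps the rewind from returning home.

```python
# Toy rewind test: integrate three bodies forward with a time-reversible
# (leapfrog) scheme, flip the velocities, integrate back, and measure
# how far we land from the starting configuration. In exact arithmetic
# the error would be zero; chaos amplifies double-precision round-off.

import numpy as np

G = 1.0  # gravitational constant in code units

def accelerations(pos, masses):
    """Pairwise Newtonian gravitational accelerations."""
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * masses[j] * r / np.linalg.norm(r) ** 3
    return acc

def leapfrog(pos, vel, masses, dt, steps):
    """Kick-drift-kick leapfrog: symplectic and time-reversible."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc
    return pos, vel

masses = np.ones(3)  # three equal masses, arbitrary initial conditions
pos0 = np.array([[1.0, 0.0], [-0.5, 0.9], [-0.5, -0.9]])
vel0 = np.array([[0.0, 0.3], [-0.3, -0.1], [0.3, -0.2]])

pos1, vel1 = leapfrog(pos0, vel0, masses, dt=1e-3, steps=20000)
pos2, _ = leapfrog(pos1, -vel1, masses, dt=1e-3, steps=20000)  # rewind

print("rewind error:", np.max(np.abs(pos2 - pos0)))

# For scale: one Planck length over one parsec is roughly
# 1.6e-35 m / 3.1e16 m ~ 5e-52 -- the ~10^-51 ratio quoted above.
```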

So, where does that leave the debate? **Is Laplace’s demon bound by such limits?** What credence is there in positing a demon with perfect knowledge of the universe to infinite precision? When does a claim about what is “in principle” (theoretically) possible for the actual universe become just bogus physics and bogus metaphysics?

Personally, I view even contemporary physics (the Standard Model, etc.) as approximate, just as Newtonian physics was approximate. So, a claim like “we can know the wave function exactly” is not completely accurate.

##### Notes

[1] Quanta Magazine > “Where Quantum Probability Comes From” by Sean Carroll, Contributing Columnist (September 9, 2019) > There are many different ways to think about probability. Quantum mechanics embodies them all.

Laplace’s demon [as portrayed in A Philosophical Essay on Probabilities (1814) by Pierre-Simon Laplace] was never supposed to be a practical thought experiment; the imagined intelligence would have to be essentially as vast as the universe itself. And in practice, chaotic dynamics can amplify tiny imperfections in the initial knowledge of the system into complete uncertainty later on. But in principle, Newtonian mechanics is deterministic.

See also:

The French mathematician Pierre-Simon Laplace pointed out a profound implication of the classical mechanics way of thinking. In principle, a vast intellect could know the state of literally every object in the universe, from which it could deduce everything that would happen in the future, as well as everything that had happened in the past. Laplace’s demon is a thought experiment, not a realistic project for an ambitious computer scientist, but the implications of the thought experiment are profound. Newtonian mechanics describes a deterministic, clockwork universe. – Carroll, Sean. Something Deeply Hidden. Penguin Publishing Group. Kindle Edition.

So, here’s another take on putting Laplace’s demon to rest. Another case for the notion of “infinite precision” as a rabbit hole. And a connection with the ancient debate over numbers (especially irrational numbers) and realism. Even the black hole information paradox.

The article includes a reference to weather and chaos theory, and how textbooks still claim that weather, as a classical system, allows “in principle” deterministic forecasts – extended extrapolations from (~infinitely) precise measurements of current conditions (points in phase space).

And mention of even projecting that idea to the universe “with the initial state of every single particle encoded with infinitely many digits of precision.”
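A minimal illustration of why “infinitely many digits” is doing all the work in that claim – using the logistic map as a stand-in for any chaotic system (an assumption for demonstration; it’s not a weather model): two starting values identical to fifteen decimal places, roughly the limit of 64-bit floats, disagree completely within a few dozen steps.

```python
# Deterministic but unforecastable: the logistic map at r = 4 is chaotic,
# so an initial difference of 1e-15 roughly doubles every iteration and
# saturates to order one after ~50 steps.

def logistic(x, steps, r=4.0):
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

x_a = 0.123456789
x_b = x_a + 1e-15  # the "same" initial condition, to double precision

for n in (10, 30, 50, 70):
    print(n, abs(logistic(x_a, n) - logistic(x_b, n)))
```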

Not so fast! What about radioactive decay?

So, is time scripted, or does it unfold creatively (flow)? What does physics say? What does our experience say? A feeling that “the present is thick.”

Quanta Magazine > Theoretical Physics > “Does Time Really Flow? New Clues Come From a Century-Old Approach to Math” by Natalie Wolchover (April 7, 2020) –

The laws of physics imply that the passage of time is an illusion. To avoid this conclusion, we might have to rethink the reality of infinitely precise numbers.

Well, language is important; the words we use to describe reality matter. Classical physics, like our everyday experience, is overly imbued with a script of particles which can be arbitrarily localized, somehow “frozen” in time.

Waves, on the other hand, are inherently fuzzy. “Thick” – non-localized (e.g., the “tail problem”). Even superpositions of many waves. (Does a superposition of an infinite number of waves make sense?)
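On that last question, a small sketch of the usual wave-packet picture (the wavenumbers and weights below are arbitrary choices for illustration): a single wave is spread over all space; superposing many waves with a spread of wavenumbers yields a localized bump – which still has tails, never a point.

```python
# Superpose cosine waves with Gaussian-weighted wavenumbers around k0.
# More waves -> a more localized "packet"; one wave -> no localization.

import numpy as np

x = np.linspace(-20, 20, 2001)

def packet(n_waves, k0=2.0, dk=0.5):
    ks = np.linspace(k0 - 3 * dk, k0 + 3 * dk, n_waves)
    weights = np.exp(-((ks - k0) ** 2) / (2 * dk ** 2))
    psi = sum(w * np.cos(k * x) for w, k in zip(weights, ks))
    return psi / psi.max()

for n in (1, 5, 51):
    psi = packet(n)
    inside = np.abs(psi) > 0.5  # crude width: where |psi| tops half its peak
    print(f"{n:3d} waves -> envelope width ~ {np.ptp(x[inside]):.1f}")
```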

Related post: Whence the arrow of time?

In his latest book, theoretical physicist Lee Smolin characterizes “the basic ambition of physics” as the ability to predict the future. At least classically, eh.

That ambition rests on a number of concepts:

• An infinitely brief momentary pause or magical suspension of the change in position and direction of all “atoms.”

• A perfectly complete instantaneous snapshot of that moment.

• Perfect encoding of all information in that snapshot as a “state.”

• Complete universal laws of physics – formulas which describe how all those “atoms” change in time (from one state to another).

• Using those formulas, a perfectly precise computation of the transformation of an initial state to a future one – a perfect input-output engine.

• The validity of reductionism – so that the determination of all atomic parts completely determines the action of the macroscopic aggregates – like coffee cups, baseballs, chairs, etc. (and perhaps living creatures, eh).

Such is the model (paradigm) of determinism (in principle).
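As a toy rendering of that paradigm (my own illustration, not drawn from the sources above): a state, a law, and a perfect input-output engine. Here the “law” is a harmonic oscillator with an exact closed-form propagator, so the demon’s job is trivial – evolution runs forward or backward without loss. The chaotic three-body results above are precisely about why this happy picture does not scale.

```python
# Laplace's demon, toy edition: a "state" (position, momentum) plus a
# deterministic law. The harmonic-oscillator propagator below is exact,
# so any state maps cleanly to its future -- or, with t < 0, its past.

import math

def propagate(x, p, t, m=1.0, k=1.0):
    """Closed-form evolution of a harmonic oscillator state by time t."""
    w = math.sqrt(k / m)
    x_t = x * math.cos(w * t) + (p / (m * w)) * math.sin(w * t)
    p_t = p * math.cos(w * t) - m * w * x * math.sin(w * t)
    return x_t, p_t

state = (1.0, 0.0)                   # the "snapshot"
future = propagate(*state, t=10.0)   # deduce the future ...
past = propagate(*future, t=-10.0)   # ... and recover the past exactly
print(future, past)                  # past == state, up to round-off
```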

But here’s the kicker – where the hubris of steampunk physics lies: successful prediction (in whatever qualified sense applies to the case at hand) is taken to validate that entire model.

Here’s another take on Nicolas Gisin’s paper – the work behind the Quanta article above.

The Next Web > “New math theory suggests time travel is impossible” by Tristan Greene (April 9, 2020).