
Laplace’s demon RIP? – demons of physics

[“Quantum foundations” series]

Demons in physics? Historically, yes – as fanciful ways to explore theories via imagined beings with marvelous skills, in an illuminating rather than malevolent context. Pure imagination, not imagineering, eh.

Physicist James Clerk Maxwell created a thought experiment in 1867. His idea involved a fantastical “finite being” able to sort the molecules of a gas. Rarefied acuity and dexterity, indeed! The topic was a hypothetical violation of the second law of thermodynamics (regarding the entropy of isolated systems). His marvel became known as “Maxwell’s demon.”

Fact checker rating: No violation of the second law.

This thought experiment has provoked debate and theoretical work on the relation between thermodynamics and information theory extending to the present day, with a number of scientists arguing that theoretical considerations rule out any practical device violating the second law in this way. – Wiki

But prior to Maxwell’s imaginative inquiry, French scholar Pierre-Simon Laplace had introduced his own expository entity in 1814 – Laplace’s demon. The topic was whether the laws of physics entailed determinism – a deterministic universe (in principle).

According to determinism, if someone (the demon) knows the precise location and momentum of every atom in the universe [a consummate snapshot], their past AND future values for any given time … can be calculated from the laws of classical mechanics. – Wiki

Fact checker rating …

Although determinism remains a lively topic in college philosophy and theology, for decades I’ve assumed that statistical mechanics and chaos theory (aka the butterfly effect) put it to rest – particularly after reading more and more about the challenges of even the 3-body problem in classical physics. And indeterminacy in quantum physics even more so.

According to chemical engineer Robert Ulanowicz, in his 1986 book Growth and Development, Laplace’s demon met its end with early 19th century developments of the concepts of irreversibility, entropy, and the second law of thermodynamics. – Wiki

So, predicating determinism (as often contrasted with “free will”) on classical physics seemed like hype. An over-the-top argument. Steampunk hubris – the hypothesis that all details could be known to infinite precision – perfect knowledge. (The debate over reductionism was another matter.)

But the notion lives on. Physicist Sean Carroll invoked Laplace’s demon recently, both for classical and quantum physics. “Laplace’s demon was never supposed to be a practical thought experiment … But in principle, Newtonian mechanics is deterministic.” [1]

And for quantum physics, in the sense that wave functions evolve deterministically – with the notion of a universal wave function (akin to Laplace’s “single formula”): “we can know the wave function exactly.” His position aligns with his advocacy of the many-worlds interpretation.

… many-worlds … is my personal favorite approach to quantum mechanics [Carroll: the simplest formulation of all the alternatives.], but it’s also the one for which it is most challenging to pinpoint how and why probability enters the game.

In many-worlds, we can know the wave function exactly, and it evolves deterministically. There is nothing unknown or unpredictable. Laplace’s demon could predict the entire future of the universe with perfect confidence. [1]

So, as Wiki notes, what’s a fact checker to do, eh.

The interpretation of quantum mechanics is still very much open for debate and there are many who take opposing views (such as the Many Worlds Interpretation and the de Broglie–Bohm interpretation). – Wiki

Anyway, that’s why this New Scientist article (below) caught my attention. The intellect or “super-powerful calculating intelligence” of Laplace’s demon has morphed into a supercomputer. And the debate becomes a question of computing capacity and power.

There has recently been proposed a limit on the computational power of the universe, i.e. the ability of Laplace’s demon to process an infinite amount of information. The limit is based on the maximum entropy of the universe, the speed of light, and the minimum amount of time taken to move information across the Planck length, and the figure was shown to be about 10^120 bits. Accordingly, anything that requires more than this amount of data cannot be computed in the amount of time that has elapsed so far in the universe. – Wiki
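The quoted figure can be roughly sanity-checked. The sketch below assumes (my assumption, not stated in the quote) that the universe’s maximum entropy is estimated holographically – about one bit per Planck area on the horizon of the observable universe – and uses rough round values for the two lengths involved.

```python
# Rough sanity check of the ~10^120-bit figure quoted above.
# Assumption (mine, not the quote's): maximum entropy estimated via the
# holographic bound, about one bit per Planck area on the horizon.
import math

l_planck  = 1.616e-35   # Planck length, m
r_horizon = 4.4e26      # radius of the observable universe, m (rough)

bits = (r_horizon / l_planck) ** 2   # (R / l_P)^2 ~ number of Planck areas
print(f"~10^{math.log10(bits):.0f} bits")
```

This back-of-envelope lands within a few orders of magnitude of the quoted 10^120 bits – close enough for a demon, eh.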

New Scientist > Physics > “Even a computer the size of the universe can’t predict everything” by Leah Crane (31 March 2020).

The article focuses on the issue of computational precision. Nature itself presents a limit, namely, the Planck length: “the most powerful computer that could ever exist can’t simulate below the Planck length.”

The three-body problem, which is the mathematical question of how three objects orbit one another according to Newton’s laws of motion, is notoriously difficult to solve because of a property called chaos. A chaotic system is one in which even a tiny change in the initial conditions of the objects, like their positions or speeds, has an enormous effect on how they move over time [both forward and backward].

Much of that difficulty comes from the fact that our computers have limited precision, so even tiny uncertainties can ruin a simulation of a chaotic system. But now Tjarda Boekholt at the University of Coimbra in Portugal and his colleagues say that even the best possible computer in the universe can’t solve this problem.

His team used extraordinarily precise simulations to probe whether a lack of precision is the only problem with predicting chaotic systems.

They started a simulation of three black holes orbiting each other at a distance of one parsec, or about 3 light years. They let it run for a while, and then tried to rewind it back to its initial configuration. They repeated this process 1212 times.

If it was impossible to rewind back to the initial configuration of the system, that would mean the system was unpredictable. The team used their results to calculate just how precise you would need to be in order to return to the initial configuration. They found that for about 5 per cent of triple systems, you would need to measure that configuration to a precision of less than a single Planck length – the smallest possible unit of measurement for length, and about 10^-51 times the initial distance between the black holes.

That means those systems are deeply unpredictable. “Even if you have a Planck length difference, which is a ridiculously small amount, some situations are still irreversible,” says Boekholt. “We can’t go more precise than nature.”

In a practical sense, this means that there is a limit to our predictive power when we try to examine the universe precisely, because even the most powerful computer that could ever exist can’t simulate below the Planck length.
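The forward-rewind test can be mimicked on a far humbler chaotic system. This sketch uses the Chirikov standard map (my stand-in, not the team’s N-body code), which has an exact algebraic inverse – so any failure to rewind comes from the tiny “measurement error” injected at the turnaround (plus round-off), amplified exponentially by chaos.

```python
# Toy version of the forward-rewind test: iterate a chaotic map forward,
# nudge the state by a tiny "measurement error", then run the exact
# algebraic inverse. Chaos amplifies the nudge, so the rewind fails to
# land anywhere near the starting point.
import math

K = 7.0  # kick strength; the standard map is strongly chaotic here

def forward(x, p):
    p = p + K * math.sin(x)
    x = x + p
    return x, p

def backward(x, p):   # exact inverse of forward (in real arithmetic)
    x = x - p
    p = p - K * math.sin(x)
    return x, p

x0, p0 = 1.0, 0.5
n = 30

x, p = x0, p0
for _ in range(n):
    x, p = forward(x, p)

x += 1e-12            # a "measurement error" at the turnaround

for _ in range(n):
    x, p = backward(x, p)

rewind_error = math.hypot(x - x0, p - p0)
print(f"perturbation 1e-12 -> rewind error ~ {rewind_error:.3g}")
```

The rewind error ends up many orders of magnitude larger than the perturbation – the same qualitative behavior the team found for triple black-hole systems, where the required precision drops below the Planck length.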

So, where does that leave the debate? Is Laplace’s demon bound by such limits? What credence is there in positing a demon with perfect knowledge of the universe to infinite precision? When does concluding something about the universe “in principle” (as theoretically possible) become just bogus physics and bogus metaphysics?

Personally, I view even contemporary physics (the Standard Model, etc.) as approximate, just as Newtonian physics was approximate. So a claim like “we can know the wave function exactly” is not completely accurate.


[1] Quanta Magazine > “Where Quantum Probability Comes From” by Sean Carroll, Contributing Columnist (September 9, 2019) > There are many different ways to think about probability. Quantum mechanics embodies them all.

Laplace’s demon [as portrayed in A Philosophical Essay on Probabilities (1814) by Pierre-Simon Laplace] was never supposed to be a practical thought experiment; the imagined intelligence would have to be essentially as vast as the universe itself. And in practice, chaotic dynamics can amplify tiny imperfections in the initial knowledge of the system into complete uncertainty later on. But in principle, Newtonian mechanics is deterministic.

See also:

The French mathematician Pierre-Simon Laplace pointed out a profound implication of the classical mechanics way of thinking. In principle, a vast intellect could know the state of literally every object in the universe, from which it could deduce everything that would happen in the future, as well as everything that had happened in the past. Laplace’s demon is a thought experiment, not a realistic project for an ambitious computer scientist, but the implications of the thought experiment are profound. Newtonian mechanics describes a deterministic, clockwork universe. – Carroll, Sean. Something Deeply Hidden. Penguin Publishing Group. Kindle Edition.

Related posts

Whence the arrow of time?

Quantum reality, quantum worlds

6 thoughts on “Laplace’s demon RIP? – demons of physics”

  1. So, here’s another take on putting Laplace’s demon to rest. Another case for the notion of “infinite precision” as a rabbit hole. And a connection with the ancient debate over numbers (especially irrational numbers) and realism. Even the black hole information paradox.

    The article includes a reference to the weather and chaos theory, and how textbooks still claim that weather as a classical system allows “in principle” deterministic forecasts – extended extrapolations from (~infinitely) precise measurement of current conditions (phase spaces).

    And mention of even projecting that idea to the universe “with the initial state of every single particle encoded with infinitely many digits of precision.”

    Not so fast! What about radioactive decay?

    Still, other popular interpretations of quantum mechanics, including the many-worlds interpretation, manage to keep the classical, deterministic notion of time alive.

    So, is time scripted, or does it unfold creatively (flow)? What does physics say? What does our experience say? A feeling that “the present is thick.”

    Quanta Magazine > Theoretical Physics > “Does Time Really Flow? New Clues Come From a Century-Old Approach to Math” by Natalie Wolchover (April 7, 2020) – The laws of physics imply that the passage of time is an illusion. To avoid this conclusion, we might have to rethink the reality of infinitely precise numbers.

    In Albert Einstein’s theory of relativity … time is woven together with the three dimensions of space, forming a bendy, four-dimensional space-time continuum — a “block universe” encompassing the entire past, present and future.

    Physicists who think carefully about time point to troubles posed by quantum mechanics, … This apparent inconsistency between the nature of time in quantum mechanics and the way it functions in relativity has created uncertainty and confusion.

    Over the past year, the Swiss physicist Nicolas Gisin has published four papers that attempt to dispel the fog surrounding time in physics. As Gisin sees it, the problem all along has been mathematical. Gisin argues that time in general and the time we call the present are easily expressed in a century-old mathematical language called intuitionist mathematics, which rejects the existence of numbers with infinitely many digits. When intuitionist math is used to describe the evolution of physical systems, it makes clear, according to Gisin, that “time really passes and new information is created.” Moreover, with this formalism, the strict determinism implied by Einstein’s equations gives way to a quantum-like unpredictability. If numbers are finite and limited in their precision, then nature itself is inherently imprecise, and thus unpredictable.

    It was on a Sunday about two and a half years ago that he realized that the deterministic picture of time in Einstein’s theory and the rest of “classical” physics implicitly assumes the existence of infinite information.

    The block universe, which implicitly assumes the existence of infinite information, must fall apart.

    “Our two big theories on physics, quantum theory and general relativity, make different statements,” said Renner [a quantum physicist at the Swiss Federal Institute of Technology Zurich]. He and several other physicists said this inconsistency underlies the struggle to find a quantum theory of gravity — a description of the quantum origin of space-time — and to understand why the Big Bang happened. “If I look at where we have paradoxes and what problems we have, in the end they always boil down to this notion of time.”

    Well, language is important – the words we use to describe reality matter. Classical physics, like our everyday experience, is overly imbued with a script of particles which can be arbitrarily localized, somehow “frozen” in time.

    Waves, on the other hand, are inherently fuzzy. “Thick” – non-localized (e.g., the “tail problem”). Even superpositions of many waves. (Does a superposition of an infinite number of waves make sense?)

    Related post: Whence the arrow of time?

  2. In his latest book, theoretical physicist Lee Smolin characterizes “the basic ambition of physics” as the ability to predict the future. At least classically, eh.

    It was hoped that this power would follow if only we could give the physical world a complete description. By [perfectly] describing fully the motion of every particle and the action of every force, we would be able to work out exactly what would happen in the future. – Smolin, Lee. Einstein’s Unfinished Revolution. Penguin Publishing Group. Kindle Edition.

    In that ambition, there are a number of concepts:

    • An infinitely brief momentary pause or magical suspension of the change in position and direction of all “atoms.”

    • A perfectly complete instantaneous snapshot of that moment.

    • Perfect encoding of all information in that snapshot as a “state.”

    • Complete universal laws of physics – formulas which describe how all those “atoms” change in time (from one state to another).

    • Using those formulas, a perfectly precise computation of the transformation of an initial state to a future one – a perfect input-output engine.

    • The validity of reductionism – so that the determination of all atomic parts completely determines the action of the macroscopic aggregates – like coffee cups, baseballs, chairs, etc. (and perhaps living creatures, eh).

    Such is the model (paradigm) of determinism (in principle).

    The hypothesis that the future is completely determined by the laws of physics acting on the present configuration of the world is called determinism. This is an extraordinarily powerful idea, whose influence can be seen in diverse fields. If you appreciate the extent to which determinism dominated thought in the nineteenth century, you can begin to understand the revolutionary impact of quantum mechanics across all fields, because quantum mechanics precludes determinism. – Ibid

    But here’s the kicker – where the hubris of steampunk physics lies: Successful prediction (in whatever qualified sense for the case at hand) validates that model.

    A successful prediction of the future state is taken as a validation of that explanation. … This confirms a belief that the information that went into describing the state is in fact a complete description of the world at one moment of time. – Ibid.

  3. Here’s another take on Gisin’s paper.

    The Next Web > “New math theory suggests time travel is impossible” by Tristan Greene (April 9, 2020).

    A physicist named Nicolas Gisin from the University of Geneva recently published a series of papers that could change our entire view on the concept of “time.”

    Gisin’s work attempts to reconcile modern-day quantum mechanics theory with an alternative math theory developed by Dutch mathematician Luitzen Egbertus Jan Brouwer in the early 20th century called “intuitionistic mathematics.”

    … Brouwer’s theory … is complex, but its biggest drawing point is that it removes the mathematical need for something called the “excluded middle.”

    [Image] A diagram from Gisin’s paper describing the relation between indeterministic physics and intuitionistic mathematics. (Credit: Nicolas Gisin)

    Those in Einstein‘s camp got around the lack of an explanation for ‘now’ in physics by adding infinities to their math. If you, for example, assume a sequence goes on infinitely, you can bend space-time theories to demonstrate a singular, infinite continuum that exists like a giant vinyl record where we, the observers, are the needle.

  4. How odd that someone else had the same idea for an article, eh? Including Descartes’ demon, Laplace’s demon, Maxwell’s demon, Bohm’s demon, Searle’s demon, Darwin’s demon.

    • The New Yorker > “Science’s Demons, from Descartes to Darwin and Beyond” by Casey Cep (January 8, 2021) – How supernatural conceptions have advanced our understanding of the natural universe.

    [Historian of science Jimena Canales’ book] “Bedeviled: A Shadow History of Demons in Science” (Princeton University Press) is not a survey of Baal, Stolas, Volac, and their kin. Instead, Canales has gathered together in one book demons with very different origins and responsibilities – among them the scientist James Clerk Maxwell’s demon, the physicist David Bohm’s demon, the philosopher John Searle’s demon, and the naturalist Charles Darwin’s demon. … They are not supernatural creatures; rather, they are particular kinds of thought experiments, placeholders of sorts for laws or theories or concepts not yet understood.

    … but, in general, scientific demons simply represent gaps in our existing knowledge, anthropomorphic accounts of the unknown or the unexplained. … because they often represent theories that violate laws of time or space, machines that exceed human capacities, and technologies that threaten human existence, the use of demons feels appropriate …

    But a crucial difference, as Canales points out, is that, though religious demons are believed to be real until proved otherwise, scientific demons are presumed imaginary until someone proves they are real. They come and go as science advances, moving from one laboratory or notebook to another as they are explained or explained away.

  5. • “Grid pandemonium overwhelms Laplace’s demon.” – Wilczek, Frank. The Lightness of Being (p. 119). Basic Books. Kindle Edition.

  6. Chaos theory is about more than the complexity of a system. Many dynamic systems “are extremely sensitive to initial conditions” and reveal the limits of predicting the evolution of these systems – despite being governed by deterministic physical laws and exhibiting patterns. Such systems expose the conceit of Laplace’s demon – that you can “know exactly, precisely, to the infinite decimal point the state of the system.” [1]

    • > “Chaos theory explained: A deep dive into an unpredictable universe” by Paul Sutter (March 18, 2022) – Chaos theory is why we will never be able to perfectly predict dynamic systems like the weather.

    (image caption) The term “The Butterfly Effect” was coined by Edward Lorenz [1972] to help describe the complex idea of chaos theory. It describes how a very small change in the initial state [of a deterministic nonlinear system] can result in large differences in a later state. Lorenz described this effect with the analogy of a butterfly flapping its wings and causing the formation of a hurricane miles away. (Image credit: tovfla via Getty Images)

    It turns out, though, that nature can be both deterministic and unpredictable. We first got hints of this way back in the 1800s, when the king of Sweden offered a prize to anyone who could solve the so-called three-body problem.

    French mathematician Henri Poincaré … won the prize … , describing all the reasons why it couldn’t be solved. One of the most important reasons he highlighted was how small differences at the beginning of the system would lead to big differences at the end.

    … [in] the mid-20th century … mathematician Edward Lorenz [found that] one tiny rounding error, no more than 1 part in a million, would lead to the completely different behavior of the weather in his model. … the signature sign [hallmark] of a chaotic system.
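    Lorenz’s rounding accident is easy to replay on the simplest chaotic system. The sketch below uses the logistic map as a stand-in for his weather model (my illustration, not Lorenz’s actual equations; the starting value is arbitrary), running two copies whose initial conditions differ by about one part in a million.

```python
# Lorenz's rounding error, replayed on the logistic map (a stand-in for
# his weather model; the starting value is arbitrary). Two runs differing
# by ~1 part in a million soon bear no resemblance to each other.
x_full      = 0.506127                 # arbitrary starting value
x_perturbed = x_full * (1 + 1e-6)      # "rounded" copy, off by ~1e-6

a, b = x_full, x_perturbed
gap = []
for _ in range(60):
    a = 4 * a * (1 - a)                # logistic map at r = 4 (chaotic)
    b = 4 * b * (1 - b)
    gap.append(abs(a - b))

print(f"initial gap {abs(x_full - x_perturbed):.1e}, "
      f"largest later gap {max(gap[20:]):.2f}")
```

    Within a few dozen iterations the tiny initial gap grows to the full size of the system – the hallmark of a chaotic system, in miniature.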


    (The article goes on to cover phase space and strange attractors.)



    [1] Wiki has an animated example of one of the simplest dynamical systems with chaotic solutions – the double-rod pendulum.

    Although no universally accepted mathematical definition of chaos exists, a commonly used definition, originally formulated by Robert L. Devaney, says that to classify a dynamical system as chaotic, it must have these properties:

    • it must be sensitive to initial conditions,
    • it must be topologically transitive,
    • it must have dense periodic orbits.

    (Image: long exposure photo. Credit: George Ioannidis, own work, under Creative Commons Attribution)
