[“Quantum foundations” series]
Demons in physics? Well, historically they've appeared as fanciful ways to explore theories via skilled marvels – an illuminating rather than malevolent context. Pure imagination, not imagineering, eh.
Physicist James Clerk Maxwell created a thought experiment in 1867. His idea involved a fantastical “finite being” able to sort molecules of a gas. Rarefied acuity and dexterity, indeed! The topic was a hypothetical violation of the second law of thermodynamics (regarding entropy of isolated systems). His marvel became known as “Maxwell’s demon.”
Fact checker rating: No violation of the second law.
This thought experiment has provoked debate and theoretical work on the relation between thermodynamics and information theory extending to the present day, with a number of scientists arguing that theoretical considerations rule out any practical device violating the second law in this way. – Wiki
But prior to Maxwell’s imaginative inquiry, French scholar Pierre-Simon Laplace had introduced his own expository entity in 1814 – Laplace’s demon. The topic was whether the laws of physics entailed determinism, a deterministic universe (in principle).
According to determinism, if someone (the demon) knows the precise location and momentum of every atom in the universe [a consummate snapshot], their past AND future values for any given time … can be calculated from the laws of classical mechanics. – Wiki
Fact checker rating …
Although determinism remains a lively topic in college philosophy and theology, for decades I’ve assumed that statistical mechanics and chaos theory (aka the butterfly effect) put it to rest – particularly after reading more and more about the challenges of even the 3-body problem in classical physics. Indeterminacy in quantum physics even more so.
According to chemical engineer Robert Ulanowicz, in his 1986 book Growth and Development, Laplace’s demon met its end with early 19th century developments of the concepts of irreversibility, entropy, and the second law of thermodynamics. – Wiki
So, predicating determinism (as often contrasted with “free will”) on classical physics seemed like hype. An over-the-top argument. Steampunk hubris – the hypothesis that all details could be known to infinite precision – perfect knowledge. (While the debate over reductionism was another matter.)
But the notion lives on. Physicist Sean Carroll invoked Laplace’s demon recently, for both classical and quantum physics: “Laplace’s demon was never supposed to be a practical thought experiment … But in principle, Newtonian mechanics is deterministic.”
And for quantum physics in the sense that wave functions evolve deterministically. With the notion of a universal wave function (akin to Laplace’s “single formula”): “we can know the wave function exactly.” His position aligns with his advocacy of the many-worlds interpretation.
… many-worlds … is my personal favorite approach to quantum mechanics [Carroll: the simplest formulation of all the alternatives.], but it’s also the one for which it is most challenging to pinpoint how and why probability enters the game.
In many-worlds, we can know the wave function exactly, and it evolves deterministically. There is nothing unknown or unpredictable. Laplace’s demon could predict the entire future of the universe with perfect confidence. 
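What “the wave function evolves deterministically” means can be seen in a toy sketch of my own (not from Carroll): a two-state “wave function” stepped forward by a fixed rotation. The rotation angle and step count are arbitrary choices for the demo. Rerunning from the same initial state always reproduces the same trajectory, and total probability is conserved – nothing random enters.

```python
# Toy illustration: deterministic, norm-preserving evolution of a
# two-state "wave function" under a fixed unitary (rotation) step.
import math

def apply_unitary(state, theta):
    """One deterministic time step: a rotation in the 2-state space."""
    a, b = state
    c, s = math.cos(theta), math.sin(theta)
    return (c * a - s * b, s * a + c * b)

def norm(state):
    return math.sqrt(sum(abs(x) ** 2 for x in state))

psi = (1 + 0j, 0 + 0j)              # initial wave function |0>
for _ in range(1000):                # many steps, same rule every time
    psi = apply_unitary(psi, 0.1)

# Determinism: a rerun from the same initial state gives the same result.
psi2 = (1 + 0j, 0 + 0j)
for _ in range(1000):
    psi2 = apply_unitary(psi2, 0.1)

print(psi == psi2)                   # identical trajectories
print(abs(norm(psi) - 1.0) < 1e-9)   # probability is conserved
```

The catch, of course, is that this says nothing about how probability enters for observers inside the system – which is exactly the challenge Carroll flags for many-worlds.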
So, as Wiki notes, what’s a fact checker to do, eh.
The interpretation of quantum mechanics is still very much open for debate and there are many who take opposing views (such as the Many Worlds Interpretation and the de Broglie–Bohm interpretation). – Wiki
Anyway, that’s why this New Scientist article (below) caught my attention. The intellect or “super-powerful calculating intelligence” of Laplace’s demon has morphed into a supercomputer. And the debate is about a question of computing capacity and power.
There has recently been proposed a limit on the computational power of the universe, i.e. the ability of Laplace’s demon to process an infinite amount of information. The limit is based on the maximum entropy of the universe, the speed of light, and the minimum amount of time taken to move information across the Planck length, and the figure was shown to be about 10^120 bits. Accordingly, anything that requires more than this amount of data cannot be computed in the amount of time that has elapsed so far in the universe. – Wiki
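As a back-of-envelope check of my own (not the original derivation), a holographic-style estimate – the horizon area of the observable universe measured in Planck units – lands within a few orders of magnitude of the quoted 10^120 figure:

```python
# Back-of-envelope sketch: maximum entropy of the universe in bits,
# estimated as (horizon radius / Planck length)^2. Constants are
# standard values; the horizon radius here is the crude c * age.
import math

c = 2.998e8              # speed of light, m/s
age = 13.8e9 * 3.156e7   # age of the universe, s (~4.35e17 s)
l_planck = 1.616e-35     # Planck length, m

horizon = c * age                    # crude horizon radius, m (~1.3e26)
bits = (horizon / l_planck) ** 2     # horizon area in Planck units

print(f"~10^{math.log10(bits):.0f} bits")   # within a few orders of 10^120
```

A cruder estimate than the published bound, but it shows where a number that large comes from: the universe in Planck-sized pixels.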
New Scientist > Physics > “Even a computer the size of the universe can’t predict everything” by Leah Crane (31 March 2020).
The article focuses on the issue of computational precision. Nature itself presents a limit, namely, the Planck length: “the most powerful computer that could ever exist can’t simulate below the Planck length.”
The three-body problem, which is the mathematical question of how three objects orbit one another according to Newton’s laws of motion, is notoriously difficult to solve because of a property called chaos. A chaotic system is one in which even a tiny change in the initial conditions of the objects, like their positions or speeds, has an enormous effect on how they move over time [both forward and backward].
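That sensitivity is easy to demonstrate with a stand-in system (my own illustration, not the three-body problem itself): the logistic map at r = 4, a textbook chaotic system. Two trajectories whose initial conditions differ by one part in 10^15 separate completely within a few dozen iterations.

```python
# Chaotic sensitivity in the logistic map x -> 4x(1 - x):
# a 1e-15 difference in initial conditions grows to order 1.
x, y = 0.3, 0.3 + 1e-15   # two nearly identical starting points
max_sep = 0.0

for _ in range(60):
    x = 4 * x * (1 - x)
    y = 4 * y * (1 - y)
    max_sep = max(max_sep, abs(x - y))

print(max_sep)   # grows to order 1 despite the 1e-15 initial difference
```

The separation roughly doubles each step, so the 15 decimal digits of agreement are exhausted after about 50 iterations – the same amplification mechanism that plagues three-body simulations.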
Much of that difficulty comes from the fact that our computers have limited precision, so even tiny uncertainties can ruin a simulation of a chaotic system. But now Tjarda Boekholt at the University of Coimbra in Portugal and his colleagues say that even the best possible computer in the universe can’t solve this problem.
His team used extraordinarily precise simulations to probe whether a lack of precision is the only problem with predicting chaotic systems.
They started a simulation of three black holes orbiting each other at a distance of one parsec, or about 3.26 light years. They let it run for a while, and then tried to rewind it back to its initial configuration. They repeated this process 1212 times.
If it was impossible to rewind back to the initial configuration of the system, that would mean the system was unpredictable. The team used their results to calculate just how precise you would need to be in order to return to the initial configuration. They found that for about 5 per cent of triple systems, you would need to measure that configuration to a precision of less than a single Planck length – the smallest possible unit of measurement for length, and about 10^-51 times the initial distance between the black holes.
That means those systems are deeply unpredictable. “Even if you have a Planck length difference, which is a ridiculously small amount, some situations are still irreversible,” says Boekholt. “We can’t go more precise than nature.”
In a practical sense, this means that there is a limit to our predictive power when we try to examine the universe precisely, because even the most powerful computer that could ever exist can’t simulate below the Planck length.
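The quoted “about 10^-51 times the initial distance” checks out with simple arithmetic (my own, using standard constants): the Planck length divided by the one-parsec initial separation.

```python
# Quick arithmetic check of the quoted precision figure:
# Planck length as a fraction of the black holes' initial separation.
import math

l_planck = 1.616e-35   # Planck length, m
parsec = 3.086e16      # one parsec, m (the initial separation in the study)

ratio = l_planck / parsec
print(f"10^{math.log10(ratio):.1f}")   # ~10^-51.3, consistent with ~10^-51
```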
So, where does that leave the debate? Is Laplace’s demon bound by such limits? What credence is there in positing a demon with perfect knowledge of the universe to infinite precision? When is concluding something about the universe as actual “in principle” (theoretically possible) just bogus physics and bogus metaphysics?
Personally I view even contemporary physics (the Standard Model, etc.) as approximate, even as Newtonian physics was approximate. So, a claim like “we can know the wave function exactly” is not completely accurate.
Quanta Magazine > “Where Quantum Probability Comes From” by Sean Carroll, Contributing Columnist (September 9, 2019) > There are many different ways to think about probability. Quantum mechanics embodies them all.
Laplace’s demon [as portrayed in A Philosophical Essay on Probabilities (1814) by Pierre-Simon Laplace] was never supposed to be a practical thought experiment; the imagined intelligence would have to be essentially as vast as the universe itself. And in practice, chaotic dynamics can amplify tiny imperfections in the initial knowledge of the system into complete uncertainty later on. But in principle, Newtonian mechanics is deterministic.
The French mathematician Pierre-Simon Laplace pointed out a profound implication of the classical mechanics way of thinking. In principle, a vast intellect could know the state of literally every object in the universe, from which it could deduce everything that would happen in the future, as well as everything that had happened in the past. Laplace’s demon is a thought experiment, not a realistic project for an ambitious computer scientist, but the implications of the thought experiment are profound. Newtonian mechanics describes a deterministic, clockwork universe. – Carroll, Sean. Something Deeply Hidden. Penguin Publishing Group. Kindle Edition.