[Draft] [“Beyond the Standard Model” series]
Background on the “crisis”
This Quanta Magazine article (below) has an eye-catching title, but its gist relates to the hierarchy problem, which I discussed in a prior post. That 2017 post (and additional commentary) used quotes by physicists – Sean Carroll, Leon Lederman, Fermilab’s Don Lincoln (video) – and Wiki to cover the context and vocabulary of the “crisis” discussed here, as well as the common mathematical techniques involved.
Rather than adding notes on this recent article as a comment to my old post, I decided that a new post was appropriate, as a follow-up to the recent post “Beyond the Standard Model – sliver of reality?”
• Quanta Magazine > “A Deepening Crisis Forces Physicists to Rethink Structure of Nature’s Laws” by Natalie Wolchover, Senior Writer/Editor (March 1, 2022) – In a slew of recent papers, researchers have thrown reductionism to the wind.
I’m not sure that starting her article by discussing Thomas Kuhn’s book helps Wolchover’s case. However, expectations for the world’s great particle colliders (the Large Hadron Collider) and mathematical models (supersymmetry, string theory, etc.) – to resolve some major puzzles – have yet to be fulfilled.
Wiki: The heuristic rule that parameters in a fundamental physical theory should not be too fine-tuned is called naturalness.
So, revisiting assumptions makes sense – seeking something more profound. Particularly regarding the mass (energy scale) of the Higgs boson. Tallying self-interactions. (There’s a useful graphic in the article.) And the energy scale of so-called vacuum energy as well.
Researchers are increasingly zeroing in on what they see as a weakness in the conventional reasoning about naturalness. It rests on a seemingly benign assumption, one that has been baked into scientific outlooks since ancient Greece: Big stuff consists of smaller, more fundamental stuff — an idea known as reductionism.
They’re exploring novel ways in which big and small distance scales might conspire, producing values of parameters that look unnaturally fine-tuned from a reductionist perspective.
Maybe looking at the problem in a new way … building on the notion of effective theories (effective field theory) – a practical ignorance of detail.
Physicists refer to low-energy, long-distance physics as “the IR,” and high-energy, short-distance physics as “the UV,” drawing an analogy with infrared and ultraviolet wavelengths of light.
You can, for instance, model water with a hydrodynamic equation that treats it as a smooth fluid, glossing over the complicated dynamics of its H2O molecules. The hydrodynamic equation includes a term representing water’s viscosity — a single number, which can be measured at IR scales, that summarizes all those molecular interactions happening in the UV.
Wolchover cites the success of applying effective field theory (EFT) to predicting the mass of the charm quark. Cutoff energies for applying a model. High vs. low-energy corrections (to avoid infinities).
A cutoff not far above the mass of the Higgs boson itself would make the Higgs about as heavy as the corrections coming from the cutoff, and everything would look natural.
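The tallying of corrections can be sketched schematically (my own summary of the standard textbook accounting, not an equation from the article):

```latex
m_{H,\text{phys}}^2 \;=\; m_{H,\text{bare}}^2 \;+\; \delta m_H^2,
\qquad
\delta m_H^2 \;\sim\; -\frac{3\, y_t^2}{8\pi^2}\,\Lambda^2 \;+\;\dots
```

Here $y_t$ is the top-quark Yukawa coupling and $\Lambda$ the cutoff. With $\Lambda$ near the Planck scale, $\delta m_H^2$ dwarfs the observed $(125\ \text{GeV})^2$ and the bare mass must be fine-tuned to cancel it; with $\Lambda$ just above the Higgs mass, correction and mass are comparable – and natural.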
And then revisiting gravity, which “doesn’t play by the usual reductionist rules.” As in black holes, where: “More energy no longer lets you see shorter distances.”
quantum gravity seems to toy with nature’s architecture, making a mockery of the neat system of nested scales that EFT-wielding physicists have grown accustomed to.
UV-IR mixing potentially resolves naturalness problems by breaking EFT’s reductionist scheme.
[Nathaniel Craig, a theoretical physicist at UCSB] “Gravity violates the normal EFT reasoning because it mixes physics at all length scales — short distances, long distances.”
Another perspective: the observable universe as a particle box with a limited number of high-energy particle states – dependent on the surface area rather than the volume, so there’s “far less high-energy activity than the EFT calculation assumes.”
That means the usual EFT calculation of the cosmological constant [an IR property of the whole universe] is too naive.
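Why “too naive”? A back-of-envelope comparison (my own illustrative arithmetic with approximate textbook values, not figures from the article) of a Planck-scale cutoff estimate of vacuum energy density against the observed dark-energy scale:

```python
import math

# Back-of-envelope check (illustrative; approximate values assumed).
# In natural units, vacuum energy density scales as (energy scale)^4.
planck_scale = 1.22e28   # Planck energy ~1.22e19 GeV, in eV
observed_scale = 2.3e-3  # observed dark-energy scale ~2.3 meV, in eV

# Orders of magnitude by which a naive cutoff-at-Planck EFT estimate
# overshoots the measured cosmological constant:
mismatch_orders = 4 * math.log10(planck_scale / observed_scale)
print(f"naive estimate too large by ~10^{mismatch_orders:.0f}")
```

This reproduces the famous mismatch of roughly 120 orders of magnitude.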
Of course, another approach uses string theory – for UV-IR mixing. Correlations which cancel infinities.
The hierarchy problem, in this context, asks why corrections from these string states don’t inflate the Higgs, if there’s nothing like supersymmetry to protect it.
The new models represent a growing grab bag of UV-IR mixing ideas.
And the quest for a quantum gravity model continues. Experimental evidence? Testable predictions? Perhaps “the whole UV-IR mixing concept lacks promise” – a promise of a paradigm shift, eh.
The Standard Model stands.
So, is there a crisis in physics? An impending paradigm shift?
What qualifies as a paradigm shift? Changes in core concepts. A new conceptual and mathematical framework. Reconsideration of exemplars and shared preconceptions. A new way of viewing reality which reconciles anomalies. Something akin to a (gestalt-like) flip in perception of an ambiguous image.
Wilczek’s parable of intelligent deepwater fish might qualify – figuring out that they are not living in empty space but in a medium called water.
Otherwise, Wiki cites some of the “classical cases” of Kuhnian paradigm shifts in science.
The Standard Model of physics is cited as an example of a currently accepted paradigm. As well as moving beyond Newtonian gravity.
The many contributions problem
Historically, the adoption of a heliocentric model of the solar system over a geocentric one is cited as a paradigm shift. Happening over decades “through a complex social process.” Not merely a shift in coordinate systems. Not merely a fine tuning of predictive calculations. Not merely an alternative explanation for everyday experience.
The geocentric Ptolemaic system entailed epicycles, circles moving on other circles, in order to align with astronomical observables – motions in the heavens. Predictions involved tallying those additional geometric contributions.
Wiki: Epicycles worked very well and were highly accurate, because, as Fourier analysis later showed, any smooth curve can be approximated to arbitrary accuracy with a sufficient number of epicycles.
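That Fourier-analysis point can be sketched numerically (my own illustrative code, not from Wiki): treat a closed curve as points in the complex plane, compute Fourier coefficients – each one an “epicycle,” a circle of radius |c_m| rotating at frequency m – and watch the worst-case error shrink as epicycles are added.

```python
import cmath

# Illustrative sketch: approximate a square path (a decidedly
# non-circular "orbit") with a sum of rotating circles (epicycles).
N = 256  # sample points along the curve

def square(t):
    """Trace the square with corners at ±1±1j at uniform speed, t in [0, 1)."""
    s = 4 * t
    k, f = int(s), s - int(s)
    corners = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
    a, b = corners[k % 4], corners[(k + 1) % 4]
    return a + (b - a) * f

samples = [square(n / N) for n in range(N)]

# Discrete Fourier coefficient c_m: one epicycle of radius |c_m|
# rotating m times per period.
K = 10
coeffs = {m: sum(z * cmath.exp(-2j * cmath.pi * m * n / N)
                 for n, z in enumerate(samples)) / N
          for m in range(-K, K + 1)}

def reconstruct(t, k):
    """Sum the epicycles with |frequency| <= k at parameter t."""
    return sum(coeffs[m] * cmath.exp(2j * cmath.pi * m * t)
               for m in range(-k, k + 1))

def max_error(k):
    """Worst-case distance between the true curve and the k-epicycle fit."""
    return max(abs(reconstruct(n / N, k) - z)
               for n, z in enumerate(samples))

# More epicycles => an arbitrarily good fit, per Fourier analysis.
print(round(max_error(1), 3), round(max_error(10), 3))
```

One epicycle pair gives a rough circle; adding higher-frequency epicycles squares off the corners – exactly the Ptolemaic tally of “additional geometric contributions.”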
But computational complexity was not the only factor in competing models. And the “purity” of circular motion persisted.
As Wiki notes:
The geocentric model was eventually replaced by the heliocentric model. Copernican heliocentrism could remove Ptolemy’s epicycles because the retrograde motion could be seen to be the result of the combination of Earth and planet movement and speeds. Copernicus felt strongly that equants were a violation of Aristotelian purity, and proved that replacement of the equant with a pair of new epicycles was entirely equivalent. Astronomers often continued using the equants instead of the epicycles because the former was easier to calculate, and gave the same result.
It has been determined, in fact, that the Copernican, Ptolemaic and even the Tychonic models provided identical results to identical inputs. They are computationally equivalent. It wasn’t until Kepler demonstrated a physical observation that could show that the physical sun is directly involved in determining an orbit that a new model was required.
The geocentric system was still held for many years afterwards, as at the time the Copernican system did not offer better predictions than the geocentric system, and it posed problems for both natural philosophy and scripture. The Copernican system was no more accurate than Ptolemy’s system, because it still used circular orbits. This was not altered until Johannes Kepler postulated that they were elliptical … .
Perhaps a paradigm shift – a new model – in quantum physics might entail something corresponding to:
- Supplanting “epicycles” (loop diagrams?),
- Supplanting purity / naturalness of “circles”,
- Introducing another factor (“the physical sun” in the above quote) directly involved in determining quantum energies.
Some of my posts speculate that puzzles about the Higgs “mass” and vacuum energy might stem from the vocabulary of “mass” and “charge” itself – the notion that so-called point particles have such reducible aspects or properties, and that such terms define intrinsic properties of so-called fundamental particles rather than topologically induced effects in energy fields and fluxes, effects which are measured as those properties.
Is “UV-IR mixing” a similar drift? Or a revised regularization scheme? Perspective flip or alternative reckoning? I’m not sure if “mixing” contains a sense of interactions between layers of an energy structure (Wilczek’s Grid), within a gestalt-like topology.
So, I’m revisiting Wiki’s article on Naturalness (physics) regarding “Naturalness and the gauge hierarchy problem” for the Higgs boson mass.
- Effective field theory
- Cutoff scale (“cut-off to the divergent loop integrals”)
- Tallying independent contributions for an observable
- The divergent radiative correction (“blow up” or managing “divergences to all orders in perturbation theory”)
- Mixing and loop contributions
Wolchover starts her article with a common go-to, discussing Thomas Kuhn’s seminal 1962 book The Structure of Scientific Revolutions. The book was a favorite study when I was in graduate school. Still often cited (through succeeding editions as well), my impression is that evolving critique of the book – particularly the notion of paradigm shift – muted its domain of applicability, raised the possibility that a more complete consideration of the history of science might not require Kuhn’s “sharp distinction between paradigmatic and non-paradigmatic science,” and left a more elusive sense of progress.
Wiki: According to a report released in 2014 by the National Science Foundation, 26% of Americans surveyed believe that the sun revolves around the Earth.
Wiki: According to Max Planck, “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
Wiki: Many philosophers and historians of science, including Kuhn himself, ultimately accepted a modified version of Kuhn’s model, which synthesizes his original view with the gradualist model that preceded it. Kuhn’s original model is now generally seen as too limited.
Personally, I wonder about paradigms and physics in the context of language – the habit of applying everyday language to discuss concepts and mathematical models wildly disconnected from human experience. Among others, Sean Carroll talks a lot about this context of emergent layered descriptions of reality, each using its own appropriate vocabulary.
My issue is that “stacking” layers of vocabulary likely compromises our understanding at some point. And makes the only shared framework a mathematical one accessible to only an analytical elite.
So, I’m not sure that a deeper understanding of quantum theory, even replacing the vocabulary of “particle, force, mass, charge,” etc., will resolve outstanding problems in modern physics. Let alone entail a so-called paradigm shift. Even with higher-dimensional topologies.
In the graphic, a possible solution is the idea that “The Higgs scale and Planck scale are connected through a complex set of push-and-pull effects.” Replacing the “push-and-pull” metaphor with “interactions” between Grid layers (e.g., quantum field energy and vacuum energy) is more to my liking.
Also, other departures from “perfection” akin to celestial flaws, mutability, and chaotic dynamics.
• Sisyphean hierarchy (April 13, 2017)