Do we really need new particle physics?
Ten years after finally observing the Higgs boson, the experimental silence is deafening.
Thomas Kuhn’s The Structure of Scientific Revolutions has made the “paradigm shift” into a trope for scientific progress. While this model of change certainly fits in retrospect, anticipating the next paradigm shift is not a well-defined problem. But this doesn’t stop folks from trying. Heady as they were, the early days of String Theory were thick with this revolutionary rhetoric.
I’m concerned that a strong emphasis on a complete upheaval of the status quo has distorted the public’s perspective on how Science - or at least Particle Physics - works in practice.
Hence this essay.
Before we discuss anything beyond the Standard Model of Particle Physics, it’s important to know what the scope of our current understanding is. Only then will we be able to distinguish “revolutionary” ideas from more structurally benign, but still theoretically satisfying, “refinements”.
Working with the Standard Model of Particle Physics
Paradigm shifts don’t happen because somebody found an error in the arithmetic. They aren’t holes poked in a physical model. Scientific models supported by experimental tests aren’t the same thing as a mathematical proof. One error does not constitute falsification. Scientific theories have an inertia built up from the history of experiments and ideas behind them.
A single “off” measurement - like the recent CDF II analysis of the W-boson mass - is simply that. The Standard Model of Particle Physics isn’t a water-balloon waiting for someone with a needle to come by and burst our understanding. A giant concrete sphere is a better metaphor. Experimental troubles need to drill hard to make the model crack.
If anything, the main concern with particle physics these days is that the Standard Model agrees too well with experiment. There’s nothing fun for theorists to work on. The only practical things to do are grind and polish that concrete sphere1.
The Standard Model in Practice
The Standard Model of Particle Physics does an outstanding job of predicting how elementary particles interact. It sets the framework for understanding the microscopic scale of our universe. In addition to a taxonomy of individual particles, the Standard Model explains how those particles interact with one another, how they bind together to form other particles, and provides consistency requirements that sharply constrain what changes might be allowed.
Precision tests of Quantum Electrodynamics - the most familiar corner of the Standard Model - have aligned theory with experiment to ten parts in a billion!
Related to this, but perhaps less widely known, is another prediction latent in the Standard Model: the electromagnetic force should get stronger at higher energies! The framework of quantum field theory relates the various parameters of the Standard Model to each other, and to the energies at which they interact. The masses of the different particles are intimately related to their charges, and vice versa.
These relationships vary with energy in a predictable way. When you smash electrons and positrons together with enough energy to produce a Z-boson, theory predicts that the value of their electric charges should be 25% larger. Precision experiments have directly verified this phenomenon!
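The flavor of this calculation can be sketched with a toy one-loop estimate. This version keeps only the electron in the quantum loop, so it understates the measured running (the full result includes every charged fermion), but it shows the mechanism:

```python
import math

# Toy one-loop QED running with ONLY the electron in the loop.
# Illustrative inputs, not precision values.
alpha0 = 1 / 137.036      # fine-structure constant at low energy
me = 0.000511             # GeV, electron mass
mz = 91.19                # GeV, Z-boson mass

log_term = math.log(mz**2 / me**2)
alpha_mz = alpha0 / (1 - (alpha0 / (3 * math.pi)) * log_term)

print(1 / alpha_mz)       # ~134.5: the coupling has grown relative to 1/137
```

The coupling grows logarithmically with energy; including all the other charged particles in the loop strengthens the effect further.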
More subtle - but powerful - consistency checks are also available in the Standard Model. For example, the conservation of energy demands that the unstable Z-boson must convert all of its rest mass energy into other, lighter particles upon decay. According to the Standard Model, the Z can decay to a bunch of different electrically charged particles and also the electrically neutral neutrinos. By averaging over as many decays as one can find, and adding up all the masses of all the charged particles observed in those decays, one can compute the expected contribution of neutrinos to the Z-boson decays.
The fraction of neutrino decays is a precise number we can compute using the model. Theoretical calculations using the Standard Model depend primarily on the number of distinct species of neutrino. The Particle Data Group reports the (averaged) experimentally observed number of such particles at 2.92±0.05, which is consistent only with the three known “flavors” of neutrino: the electron neutrino, the muon neutrino and the tau neutrino.
If there were an as yet unobserved “4th neutrino” that was produced in Z-decays, the fraction of the Z’s rest mass energy devoted to charged particles would be lower than what we see. This is a precision-based consistency check on a qualitative structure of the standard model: the number of distinct neutrino species.
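The arithmetic behind this consistency check fits in a few lines. The widths below are approximate, PDG-style values used purely for illustration:

```python
# Counting light neutrino species from the Z-boson's decay width.
# Widths in MeV; approximate values, for illustration only.
gamma_total = 2495.2     # total Z width
gamma_hadrons = 1744.4   # width into quarks (hadrons)
gamma_lepton = 83.98     # width into ONE charged-lepton pair (e, mu, or tau)
gamma_nu_SM = 167.2      # Standard Model width into ONE neutrino pair

# Whatever width isn't accounted for by visible particles is "invisible",
# and the Standard Model attributes it to neutrinos.
gamma_invisible = gamma_total - gamma_hadrons - 3 * gamma_lepton
n_nu = gamma_invisible / gamma_nu_SM

print(n_nu)              # ~2.98: consistent with exactly three flavors
```

A fourth light neutrino would push this number toward 4, which the data plainly rule out.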
It gets better. Given that the Standard Model is organized into three generations of fermions, this also rules out the existence of a simple, 4th generation of particles.
The Standard Model in Theory
The biggest complaint about the Standard Model is also what makes it so difficult to communicate: it’s really complicated. There are quarks and leptons, force-carrying bosons, and a vast assortment of interactions between them all. As discussed above, the properties of all these particles are interrelated and tied up with the energy at which they interact.
Including the effects of neutrino masses, the Standard Model has 26 parameters. (And somehow this model still has predictive power!) Noticeably absent among these is the mass of the proton.
The proton is a composite particle made up from three individual quarks. Amusingly, the masses of those quarks contribute almost nothing to the overall mass of the proton itself. The proton mass is generated dynamically - a direct manifestation of Einstein’s famous E=mc^2 formula. It’s made from subnuclear goo, like the gluons that communicate the strong nuclear force.
Because of the success in explaining things like the proton mass using gluons, there is hope that some of the 26 parameters of the Standard Model can be similarly explained away by some physical dynamics. Given our modern understanding of particle physics, these parameters could be reinterpreted as effective parameters for some phenomena we do not yet understand.
In other words, the Standard Model is theoretically unsatisfying because it appears incomplete. But does this constitute a need for new physics, or is it merely an aesthetic complaint?
The Higgs Instability
Physicists tend to believe that they need new physics to work on2. New physics usually requires an unsolved problem. In theoretical particle physics these days, unsolved problems feel more like loose ends.
For theorists, at least one of those loose ends - the mass of the Higgs Boson - comes with two servings of existential dread.
Unlike other parameters in the Standard Model, the mass of the Higgs boson appears too low. Far too low.
It’s arguably unscientific to talk qualitatively about what the mass of a particle “should be”. In the absence of a concrete model, this is ultimately a question for philosophers, not physicists. But when a physicist says the mass of a particle is “too low”, they typically have very precise concerns.
The Scope of Standard Model Parameters
In addition to finding the masses of particles, physicists also like to see how “generic” those masses are. Remember, the masses and other parameters of the Standard Model are interrelated. If nature changed one of them just a little bit, how would that tiny change affect the rest?
Ideally, a little change in the mass or charge of a particle will have little effect on the resulting physics. If that’s true, then we have a pretty strong handle on how the physics works. When models are insensitive to tiny changes of parameters, we call them stable3.
To the chagrin of Philosophers everywhere, Physicists call such stable parameters “natural”.
The Higgs is not Natural
Quantum Mechanics allows particles like the Higgs to burst apart into different species and then collapse back into themselves. Rapidly. These blips or loops, as they’re called, involve virtual particles4 appearing and disappearing so quickly that they aren’t typically observable. What we see instead is the aggregate, statistical effect of this behavior.
Earlier we mentioned the variability of the electron’s charge with collision energy. This is an example of these quantum loop effects. The electrons are surrounded by a cloud of such virtual photons, electrons and positrons. The statistical effect of this cloud of virtual particles amounts to a screening of the true electric charge.
Particles with higher energy penetrate deeper into the electron’s cloud before recoiling away. The deeper a colliding particle gets through the shielding cloud, the more of the electron’s “bare” charge it gets to see. Hence the effective electric charge of the electron increases with collision energy.
Carrying that cloud of virtual particles around also affects the inertia of the electron. The mass, in other words, can also change as a statistical effect of these virtual particles.
The Higgs boson also has a cloud of virtual particles, and top quarks make up the largest fraction of that cloud. Top quarks are heavy and contribute a large amount to the aggregated mass of the Higgs boson.
But here’s the thing: the Higgs plays an unusual role in the Standard Model of Particle Physics. Through its interactions, the Higgs bestows mass upon many of the fundamental particles like quarks and leptons. Because of that unusual role, the particles appearing in the Higgs’ virtual cloud have a mass that depends on the mass of the Higgs! All this gives rise to a peculiar feedback effect: the mass of the Higgs boson changes violently with collision energy!
The Higgs boson, in other words, is not technically natural.
Technical Naturalness, Stability and Fine Tuning
As we’ve discussed, the mass and interaction strengths between fundamental particles change depending on how hard you hit them. How dramatically they change with energy is related to how stable they are.
The typical size of the quantum corrections to the Higgs mass is enormous! The ratio of its physical mass to the quantum corrections is something like 1 part in 10^16.
A natural ratio - like almost all the other parameters of the Standard Model - would be closer to something like 1 part in 10.
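A back-of-the-envelope version of this comparison, assuming the dominant top-quark loop is cut off at the Planck scale (a common, but not unique, choice of cutoff):

```python
import math

# Rough one-loop estimate of the top-quark contribution to the Higgs
# mass-squared, cut off at the Planck scale. Order-of-magnitude inputs only.
mh = 125.0          # GeV, measured Higgs mass
yt = 0.94           # top-quark Yukawa coupling
cutoff = 1.2e19     # GeV, Planck scale used as the loop cutoff

delta_mh2 = 3 * yt**2 / (8 * math.pi**2) * cutoff**2   # quantum correction to mh^2
tuning = math.sqrt(mh**2 / delta_mh2)                  # ratio at the level of masses

print(f"mass ratio ~ 1 part in 10^{-math.log10(tuning):.0f}")
```

Lower the cutoff and the tension eases, which is exactly why new particles at accessible energies were such an attractive resolution.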
Usually, unnatural values of physical parameters are explained by the existence of new particles. The charm quark, for example, was predicted to explain the otherwise unnaturally small rate of kaon decays into leptons.
Models based on Weak Scale Supersymmetry were exciting precisely because they were rich in new particles which could account for the fine tuning of the Higgs mass.
To date, there is no direct evidence for any such particles. And so the problem remains:
Why is the Higgs’ mass so small?
But for theoretical physicists, a more pressing question undergirds this one:
Is that question within the scope of Scientific Inquiry… at all?
We don’t yet know. We may never know. In any case, there’s more work to be done.
The Higgs Vacuum Metastability
This conundrum gets worse. As we’ve discussed, the Higgs gives a mass to other particles like electrons, muons and quarks through a convoluted set of particle interactions. But the Higgs boson also interacts with itself!
Through the same quantum loop effects, the strength of the Higgs’ self interaction becomes related to its mass, and therefore to the masses of all those other particles.
Physicists visualize these collective effects in terms of a potential energy: like a ball rolling down a hill.
The potential energy of the Higgs can be written as a simple polynomial:
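No equation survives in the text here; in terms of the three parameters named just below, a quartic polynomial of the schematic form (exact coefficients and signs depend on conventions) is presumably intended:

```latex
V(h) \;=\; m^2\, h^2 \;+\; Q\, h^3 \;+\; \lambda\, h^4
```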
Varying m, Q and λ corresponds to varying the shape of this potential energy landscape. Just as with a ball rolling along a track, the Higgs field itself settles into a minimum of the potential.
Now that we’ve observed the Higgs, we can plug the numbers in to see what this potential energy landscape looks like. Annoyingly, it’s upside down: at large values of the Higgs field, the potential eventually turns over and heads downward.
In short, the Higgs field - which literally controls the physical parameters of our known universe - is perched in a metastable position.
As with alpha decay in nuclear physics5, quantum effects give a nonzero probability for the Higgs field to pop out and roll away.
Granted, studies suggest that this metastability of the Higgs vacuum should last longer than the known age of the universe. Moreover, it doesn’t have an appreciable effect on our models of the earliest moments of our universe6. But it does raise the ante on concerns about naturalness and predictability in Physics.
If the Higgs vacuum really is finely tuned, then it might be finely tuned precisely because the universe just happened to land in a corresponding metastable state. Meaning: the masses of particles endowed by the Higgs, like the electron and the muon, the details of our very existence, might well be a pure cosmic accident7.
Or. As with the case of the charm quark and the GIM mechanism, it may well be resolved by the presence of new Physics. Only more grinding scientific research can refine our understanding and potentially answer this question.
So Do We Really Need New Particle Physics?
Probably, but not necessarily because of the Higgs instability. Physicist John Ellis has been writing about this routinely for decades.
“It used to be said that the nightmare scenario for the LHC would be to discover the Higgs boson and nothing else. However, the measured masses of the Higgs boson and the top quark may be hinting that there must be physics beyond the SM that stabilises the vacuum. Let us take heart from this argument, and keep looking for new physics, even if there is no guarantee of immediate success.”
The universe as we understand it could be “finely tuned”. Statistical anomalies aside, there is as yet no significant, direct experimental result that requires the existence of a new fundamental particle or interaction.
Instead, we likely need some new particle physics to explain the effects of Dark Matter.
It may well be that this Dark Matter - whatever it is - only interacts gravitationally. It may not interact via the weak nuclear force at all, which means it would have zero impact on the Higgs mass or its associated technical instability.
But let’s hope not.
That would actually be the nightmare scenario for Particle Physicists, since capturing and studying such a “gravitation only” particle seems like an intractable problem for the foreseeable future. Solving that would no doubt require revolutionary thinking.
Nevertheless, if there’s one thing to moralize from Kuhn’s Revolutions, it’s that any new understanding of how Nature works requires both time and hard work.
Which is still heroic, important and serious professional work!
For a counterpoint, see the previous footnote.
Unstable models of physics are typically associated with something new: new particles or a new phase of matter. Water, for example, is a pretty benign substance, unless you bring it near its freezing or boiling point. Its behavior changes dramatically in either case. Usually particle physicists are more interested in so-called second order phase transitions, but hopefully the flavor of unstable phenomena is clear by example.
We say virtual because they are a transient effect. Quantum mechanics allows particles to exist with all kinds of different masses - for example - so long as they don’t persist for too long. The precise value of “too long” here is governed by the physical parameter ħ, as it appears in the famous Uncertainty Relation.
There’s a whole story here in Five Parts. Here’s a Spotify link. We’ve also got one for Apple Podcasts.
More precisely, the reheating temperature after inflation seems to be well below the threshold for popping the Higgs out of its metastable state.
There are many attempts to quantify what the probability of our version of the universe is relative to the state of whatever could have been, which is an inherently silly task given that we don’t know what the full set of possibilities is. Such publications - when honest - aim to be descriptive rather than prescriptive or even predictive. I even wrote one of those papers nearly a decade ago! Khoury and Wong in particular have given an argument that would lead to universes rather like the one we currently have evidence for residing in.