Is particle physics dead, dying, or just hard?
36 points
1 hour ago
| 11 comments
| quantamagazine.org
bananaflag
42 minutes ago
[-]
It's basically the opposite situation from 150 years ago.

Back then, we thought our theory was more or less complete while having experimental data which disproved it (the Michelson-Morley experiment, Mercury's perihelion precession; I am sure there are others).

Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.

reply
Paracompact
16 minutes ago
[-]
What about underexplained cosmological epicycles like dark matter (invoked to explain long-standing divergences between gravitational theory and observation), or the Hubble tension?
reply
XorNot
14 minutes ago
[-]
This is your regular reminder that epicycles were not an incorrect addition to the theory until an alternative hypothesis could explain the same behavior without requiring them.
reply
Paracompact
7 minutes ago
[-]
Sure, but in that regard dark matter is even more unsatisfying than (contemporary) epicycles, because not only does it add extra complexity, it doesn't even characterize the source of that complexity beyond its gravitational effects.
reply
cozzyd
3 minutes ago
[-]
Even better, there are the "nightmare" scenarios where dark matter can only interact gravitationally with Standard Model particles.
reply
klipt
16 minutes ago
[-]
Doesn't that imply our theories are "good enough" for all practical purposes? If they're impossible to empirically disprove?
reply
doctoboggan
11 minutes ago
[-]
The theories don't answer all the questions we can ask, namely questions about how gravity behaves at the quantum scale. (These questions pop up when exploring extremely dense regions of space - the very early universe and black holes).
reply
PlatoIsADisease
11 minutes ago
[-]
If I had to make a guess, we are at a pre-Copernicus level in particle physics.

We are finding local maxima (induction), but the establishment cannot handle deduction.

Everything is an overly complex bandaid. At some point someone will find something elegant that predicts 70% as well, and at some point we will realize: 'Oh, that's great, the sun actually is at the center of the solar system; Copernicus was only slightly wrong in thinking the planets move in circles. We just needed to use ellipses!'

But with particles.

reply
throw_m239339
11 minutes ago
[-]
I find the idea that reality might be quantized fascinating, since it would mean all the information that exists could be stored in a sufficiently large storage medium.

It's also kind of interesting how causality allegedly has a speed limit, and a rather slow one, all things considered.

Anyway, we have absolutely come a long way in 150 years, and we'll figure it out eventually, but as always, figuring it out might lead to even bigger questions and mysteries...

reply
tasty_freeze
30 minutes ago
[-]
Here is one fact that seems, to me, pretty convincing that there is another layer underneath what we know.

The charge of the electron is -1 and of the proton +1. The magnitudes have been experimentally measured to be the same out to 12 digits or so, just opposite in sign. However, there is no theory for why this is -- they are simply measured, and that is it.

It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There is almost certainly a lower-level mechanism which explains why they are exactly the same but opposite.

reply
Paracompact
22 minutes ago
[-]
Technically, the charge of a proton can be derived from its constituent two up quarks and one down quark, which have charges +2/3 and -1/3 respectively. I'm not aware of any deeper reason why these should be simple fractional ratios of the charge of the electron; however, I'm not sure there needs to be one. If you believe the stack of turtles ends somewhere, you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
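For concreteness, the arithmetic being referred to (nothing beyond the textbook charge assignments):

    proton  = up + up + down   = (+2/3) + (+2/3) + (-1/3) = +1
    neutron = up + down + down = (+2/3) + (-1/3) + (-1/3) =  0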
reply
JumpCrisscross
10 minutes ago
[-]
> you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?

No. It’s almost certainly not a coïncidence that these charges are symmetric like that (in stable particles that like to hang out together).

reply
Paracompact
1 minute ago
[-]
Whence your confidence? As they say in math, "There aren't enough small numbers to meet the many demands made of them." If we assume the turtle stack ends, and it ends simply (i.e. with small numbers), some of those numbers may wind up looking alike. Even more so if you find anthropic arguments convincing, or if you consider sampling bias (which may be what you mean by, "in stable particles that like to hang out together").
reply
hackyhacky
3 minutes ago
[-]
> coïncidence

Nïce

reply
andyfilms1
12 minutes ago
[-]
For a given calculation on given hardware, the 100th digit of a floating point decimal can be replicated every time. But that digit is basically just noise, and has no influence on the 1st digit.

In other words: There can be multiple "layers" of linked states, but that doesn't necessarily mean the lower layers "create" the higher layers, or vice versa.
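A toy version of the first point (Python, assuming ordinary IEEE-754 doubles; 0.1 is just an arbitrary example value):

    # The long tail of printed digits is exactly reproducible, run after run,
    # but everything past the first ~17 significant digits just reflects the
    # binary encoding of the double, not extra information about "one tenth".
    print(f"{0.1:.60f}")
    # -> 0.100000000000000005551115123125782702118158340454101562500000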

reply
rjh29
21 minutes ago
[-]
One argument (unsatisfying as it is) is that there are trillions of possible configurations, but ours is the one that happened to work, which is why we're here to observe it. Changing any of them even a little bit would result in an empty universe.
reply
libraryofbabel
1 minute ago
[-]
There’s a name for that: the anthropic principle. And it is deeply unsatisfying as an explanation.

And does it even apply here? If the charge on the electron differed from the charge on the proton at just the 12th decimal place, would that actually prevent complex life from forming? Citation needed for that one.

I agree with OP. The unexplained symmetry points to a deeper level.

reply
PaulHoule
28 minutes ago
[-]
If it wasn't the case then matter wouldn't be stable.
reply
libraryofbabel
25 seconds ago
[-]
Is that actually true, if the charges differed at the 12th decimal place only? That’s non-obvious to me.
reply
wvbdmp
24 minutes ago
[-]
Aren’t things like this usually explained by being the only viable configuration, or is that not the case here?
reply
throwup238
22 minutes ago
[-]
Or why the quarks that make up protons and neutrons have fractional charges, with the +1 proton mixing two +2/3 up quarks and one -1/3 down quark, while the neutral neutron is one up quark and two down quarks. And where are all the other Quarks in all of this, busy tending bar?
reply
david-gpu
11 minutes ago
[-]
They have fractional charges because that is how we happen to measure charge. If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.

Now, the ratios between these charges appear to be fundamental. But the presence of fractions is arbitrary.
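To spell out the rescaling (plain arithmetic, with e/3 as the hypothetical unit):

    up = +2, down = -1, electron = -3
    proton  = 2(+2) + (-1) = +3
    neutron = (+2) + 2(-1) =  0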

reply
jiggawatts
6 minutes ago
[-]
This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.

For example, pair production is:

    photon + photon = electron + (-)electron
You can take that diagram, rotate it in spacetime, and you get the direct equivalent: an electron changing its path by scattering off a photon (Compton scattering):

    electron + photon = electron + photon
There are similar formulas for beta decay, which is:

    neutron = proton + electron + (-)neutrino
You can "rotate" this diagram, or any other Feynman diagram. This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.

The precise why of this algebra is the big question! People are chipping away at it, and there's been slow but steady progress.

One of the "best" approaches I've seen is "The Harari-Shupe preon model and nonrelativistic quantum phase space"[1] by Piotr Zenczykowski which makes the claim that just like how Schrodinger "solved" the quantum wave equation by using complex numbers, it's possible to solve a slightly extended version of the same equation in 6D phase space, yielding matrices that have properties that match the Harari-Shupe preon model. The preon model claims that fundamental particles are further subdivided into preons, the "charges" of which neatly add up to the observed zoo of particle charges. The preon model has issues with particle masses and binding energies, but Piotr's work neatly sidesteps that issue by claiming that the preons aren't "particles" as such, but just mathematical properties of these matrices.

I put "best" in quotes above because there isn't anything remotely like a widely accepted theory for this yet, just a few clever people throwing ideas at the wall to see what sticks.

[1] https://arxiv.org/abs/0803.0223

reply
GMoromisato
33 minutes ago
[-]
The use of "AI" in particle physics is not new. In 1999 they were using neural nets to compute various results. Here's one from Measurement of the top quark pair production cross section in p¯p collisions using multijet final states [https://repository.ias.ac.in/36977/1/36977.pdf]

"The analysis has been optimized using neural networks to achieve the smallest expected fractional uncertainty on the t¯t production cross section"

reply
jdshaffer
29 minutes ago
[-]
I remember back in 1995 or so being in a professor's office at Indiana University, and he was talking about trying to figure out how to use neural networks to automatically track particle trails in bubble chamber results. He was part of a project at CERN at the time. So, yeah, they've been using NNs for quite a while. :-)
reply
aatd86
33 minutes ago
[-]
Isn't it the mathematics that is lagging? The amplituhedron? Higher-dimensional models?

Fun fact: I got to read the thesis of one of my uncles, who was a young professor back in the '90s, right when they were discovering bosons. They were already modelling them as tensors back then. And probably multilinear transformations.

Now that I'm grown I can understand a little more; I was about 10 years old back then. I had no idea he was studying and teaching the state of the art. xD

reply
jahnu
45 minutes ago
[-]
I find the arguments from those who say there is no crisis convincing. Progress doesn't happen at a constant rate. We made incredible, unprecedented progress in the 20th century. The most likely scenario is that it will slow down for a while. Perhaps for hundreds of years again! Nobody can know. We are still making enormous strides compared to most of scientific history.
reply
Insanity
43 minutes ago
[-]
Although we do have many more people working on these problems now than at any time in the past. That said, science progresses one dead scientist at a time, so it might still take generations for a new golden era.
reply
davidw
35 minutes ago
[-]
It's impossible to tell without opening the box that particle physics is in.
reply
gowld
31 minutes ago
[-]
Information content of the article:

The discovery of the Higgs boson in 2012 completed the Standard Model of particle physics, but the field has since faced a "crisis" due to the lack of new discoveries. The Large Hadron Collider (LHC) has not found any particles or forces beyond the Standard Model, defying theoretical expectations that additional particles would appear to solve the "hierarchy problem"—the unnatural gap between the Higgs mass and the Planck scale. This absence of new physics challenged the "naturalness" argument that had long guided the field.

In 2012, physicist Adam Falkowski predicted the field would undergo a slow decay without new discoveries. Reviewing the state of the field in 2026, he maintains that experimental particle physics is indeed dying, citing a "brain drain" where talented postdocs are leaving the field for jobs in AI and data science. However, the LHC remains operational and is expected to run for at least another decade.

Artificial intelligence is now being integrated into the field to improve data handling. AI pattern recognizers are classifying collision debris more accurately than human-written algorithms, allowing for more precise measurements of "scattering amplitude" or interaction probabilities. Some physicists, like Matt Strassler, argue that new physics might not lie at higher energies but could be hidden in "unexplored territory" at lower energies, such as unstable dark matter particles that decay into muon-antimuon pairs.

CERN physicists have proposed a Future Circular Collider (FCC), a 91-kilometer tunnel that would triple the circumference of the LHC. The plan involves first colliding electrons to measure scattering amplitudes precisely, followed by proton collisions at energies roughly seven times higher than the LHC later in the century. Formal approval and funding for this project are not expected before 2028.

Meanwhile, U.S. physicists are pursuing a muon collider. Muons are elementary particles like electrons but are 200 times heavier, allowing for high-energy, clean collisions. The challenge is that muons are highly unstable and decay in microseconds, requiring rapid acceleration. A June 2025 national report endorsed the program, which is estimated to take about 30 years to develop and cost between $10 and $20 billion.

China has reportedly moved away from plans to build a massive supercollider. Instead, they are favoring a cheaper experiment costing hundreds of millions of dollars—a "super-tau-charm facility"—designed to produce tau particles and charm quarks at lower energies.

On the theoretical side, some researchers have shifted to "amplitudeology," the abstract mathematical study of scattering amplitudes, in hopes of reformulating particle physics equations to connect with quantum gravity. Additionally, Jared Kaplan, a former physicist and co-founder of the AI company Anthropic, suggests that AI progress is outpacing scientific experimentation, positing that future colliders or theoretical breakthroughs might eventually be designed or discovered by AI rather than humans.

reply
ktallett
43 minutes ago
[-]
Is it more that even the most dedicated and passionate researchers have to frame their interests in a way that will get funding? Particle physics is not the thing those with the cash will fund right now. AI and QC are the focus.
reply
Legend2440
35 minutes ago
[-]
Well, it's hard to make an argument for a $100 billion collider when your $10 billion collider didn't find anything revolutionary.

Scaling up particle colliders has arguably hit diminishing returns.

reply
bsder
33 minutes ago
[-]
Theoretical physics progresses via the anomalies it can't explain.

The problem is that we've mostly explained everything we have easy access to. We simply don't have that many anomalies left. Theoretical physicists were both happy and disappointed that the LHC simply verified everything--theories were correct, but there weren't really any pointers to where to go next.

Quantum gravity seems to be the big one, but that is not something we can penetrate easily. LIGO has only recently come online, and it can only really detect enormous events (like black hole mergers).

And while we don't always understand what things do as we scale up or in the aggregate, that doesn't require new physics to explain.

reply
tehjoker
1 hour ago
[-]
It's kind of legitimate, but it's kind of sad to see some of the smartest people in society just being like "maybe AI will just give me the answer," a phrase that has a lot of potential to be thought-terminating.
reply
emmelaich
53 minutes ago
[-]
That's mentioned in the article too:

>Cari Cesarotti, a postdoctoral fellow in the theory group at CERN, is skeptical about that future. She notices chatbots’ mistakes, and how they’ve become too much of a crutch for physics students. “AI is making people worse at physics,” she said.

reply
yalok
30 minutes ago
[-]
this. Deep understanding of physics involves building a mental model & intuition for how things work, and the process of building it is what gives you the skill to deduce & predict. Using AI to just get to the answers directly prevents building that "muscle" strength...
reply
0x3f
28 minutes ago
[-]
I'm quite happy that it might give me, with pre-existing skills, more time on the clock to stay relevant.
reply
meindnoch
14 minutes ago
[-]
Maybe it's time for physicists to switch to agile? Don't try to solve the theory of the Universe at once; that's the waterfall model. Try to come up with just a single new equation each sprint!
reply