What comes after the LHC? – The P5 Report & Future Colliders

This is the second part of our coverage of the P5 report and its implications for particle physics. To read the first part, click here.

One of the thorniest questions in particle physics is ‘What comes after the LHC?’. This was one of the areas where people were most uncertain about what the P5 report would say. Globally, the field is trying to decide what to do once the LHC winds down in ~2040. While the LHC is scheduled to get an upgrade in the latter half of this decade and run until the end of the 2030s, the field must start planning now for what comes next. For better or worse, big smash-y things seem to capture a lot of public interest, so the debate over what large collider project to build has gotten heated. Even Elon Musk is tweeting (X-ing?) memes about it.

Famously, the US’s last large accelerator project, the Superconducting Super Collider (SSC), was cancelled in the ’90s partway through its construction. The LHC itself often faced perilous funding situations, and required CERN to make the unprecedented move of taking out a loan to pay for its construction. So no one takes for granted that future large collider projects will ultimately come to fruition.

Desert or Discovery?

When debating what comes next, dashed hopes of LHC discoveries are top of mind. The LHC experiments were primarily designed to search for the Higgs boson, which they successfully found in 2012. However, many had predicted (perhaps over-confidently) that it would also discover a slew of other particles, like those from supersymmetry or those heralding extra dimensions of spacetime. These predictions stemmed from a favored principle of nature called ‘naturalness’, which argued that additional particles close in energy to the Higgs were needed to keep its mass at a reasonable value. While there is still much LHC data to analyze, many searches for these particles have been performed so far, and no signs of them have been seen.

These null results led to some soul-searching within particle physics. The motivations behind the ‘naturalness’ principle, which said the Higgs had to be accompanied by other particles, have been questioned both within the field and in New York Times op-eds.

No one questions that deep mysteries like the origins of dark matter, the matter anti-matter asymmetry, and neutrino masses remain. But with the Higgs filling in the last piece of the Standard Model, some worry that answers to these questions in the form of new particles may only exist at energy scales entirely out of the reach of human technology. If true, future colliders would have no hope of finding them.

A diagram of the particles of the Standard Model laid out as a function of energy. The LHC and other experiments have probed up to around 10^3 GeV, and found all the particles of the Standard Model. Some worry new particles may only exist at the extremely high energies of the Planck or GUT energy scales. This would imply a large ‘desert’ in energy: many orders of magnitude in which no new particles exist. Figure adapted from here

The situation being faced now is qualitatively different from the pre-LHC era. Prior to the LHC turning on, ‘no-lose theorems’ based on the mathematical consistency of the Standard Model guaranteed that it would discover the Higgs or some other new particle like it. This made the justification for its construction as bullet-proof as one can get in science: a guaranteed Nobel-prize discovery. But now, with the last piece of the Standard Model filled in, there are no more free wins; guaranteed breakdowns of the Standard Model don’t occur until energy scales we would need solar-system-sized colliders to probe. Now, like all other fields of science, we cannot predict what discoveries we may find with future collider experiments.

Still, optimists hope, and have their reasons to believe, that nature may not be so unkind as to hide its secrets behind walls so far outside our ability to climb. There are compelling models of dark matter that live just outside the energy reach of the LHC and predict rates too low for direct detection experiments, but that would be definitively discovered or ruled out by high-energy colliders. The nature of the ‘phase transition’ that occurred in the very early universe, which may explain the prevalence of matter over anti-matter, could also be pinned down. There are also a slew of experimental ‘hints’, all of which come with significant question marks, but which could point to new particles within the reach of a future collider.

Many also advocate for building a future machine simply to study nature itself, with less emphasis on discovering new particles. They argue that even if we only further confirm the Standard Model, it is a worthwhile endeavor. Though we can calculate Standard Model predictions at high energies, we will not ‘know’ whether nature actually works like this until we test it in those regimes. They argue this is a fundamental part of the scientific process, and should not be abandoned so easily. Chief among the untested predictions are those surrounding the Higgs boson. The Higgs is a central, somewhat mysterious piece of the Standard Model, but is difficult to measure precisely in the noisy environment of the LHC. Future colliders would allow us to study it with much better precision, and verify whether or not it behaves as the Standard Model predicts.

Projects

These theoretical debates directly inform what colliders are being proposed and what their scientific case is.

Many are advocating for a “Higgs factory”: a collider based on clean electron-positron collisions that could be used to study the Higgs in much more detail than the messy proton collisions of the LHC allow. Such a machine would be sensitive to subtle deviations of Higgs behavior from Standard Model predictions. Such deviations could come from the quantum effects of heavy, yet-undiscovered particles interacting with the Higgs. However, to determine what particles are causing those deviations, it’s likely one would need a new ‘discovery’ machine with high enough energy to produce them.

Among the Higgs factory options is the International Linear Collider (ILC), a proposed 20 km linear machine which would be hosted in Japan. ILC designs have been ‘ready to go’ for the last 10 years, but the Japanese government has repeatedly waffled on whether to approve the project. Sitting in limbo for this long has led many to be pessimistic about the project’s future, but certainly many in the global community would be ecstatic to work on such a machine if it were approved.

Designs for the ILC have been ready for nearly a decade, but it’s unclear if it will receive the green light from the Japanese government. Image source

Alternatively, some in the US have proposed building a linear collider based on ‘cool copper’ cavities (C3) rather than the standard superconducting ones. These copper cavities can achieve more acceleration per meter, meaning a linear Higgs factory could be constructed with a reduced 8 km footprint. A more compact design can significantly cut down on the infrastructure costs that governments are usually reluctant to spend their science funding on. Advocates have proposed it as a cost-effective Higgs factory option whose small footprint means it could potentially be hosted in the US.

The Future Circular Collider (FCC), CERN’s proposed successor to the LHC, would kill both birds with one extremely long stone. Similar to the progression from LEP to the LHC, this proposed 90 km collider would run as a Higgs factory using electron-positron collisions starting in 2045, before eventually switching to a ~90 TeV proton-proton collider starting in ~2075.

An image of the proposed FCC overlayed on a map of the French/Swiss border
Designs for the massive 90km FCC ring surrounding Geneva

Such a machine would undoubtedly answer many of the important questions in particle physics. However, many have concerns about the huge infrastructure costs needed to dig such a massive tunnel and about the extremely long timescale before direct discoveries could be made: most of the current field will not be around 50 years from now to see what such a machine finds. The FCC is also facing competition, as Chinese physicists have proposed a very similar design (CEPC) whose construction could potentially start much earlier.

During the Snowmass process, many in the US started pushing for an ambitious alternative: a new type of machine that collides muons, the heavier cousins of electrons. A muon collider could reach the high energies of a discovery machine while also maintaining a clean environment in which Higgs measurements can be performed. However, muons are unstable, and collecting enough of them into a beam before they decay is a difficult task which has not been done before. A group of dedicated enthusiasts designed t-shirts and Twitter memes to capture the excitement of the community. While everyone agrees such a machine would be amazing, the key technologies necessary for such a collider are less developed than those of electron-positron and proton colliders. However, if the necessary technological hurdles could be overcome, such a machine could turn on decades before the planned proton-proton run of the FCC. It also presents a much more compact design, at only 10 km circumference, roughly three times smaller than the LHC. Advocates are particularly excited that this would allow it to be built within the site of Fermilab, the US’s flagship particle physics lab, which would represent a return to collider prominence for the US.

A proposed design for a muon collider. It relies on ambitious new technologies, but could potentially deliver similar physics to the FCC decades sooner and with a ten times smaller footprint. Source

Deliberation & Decision

This plethora of collider options, each coming with a very different vision of the field 25 years from now, led to many contentious debates in the community. The extremely long timescales of these projects led to discussions of human lifespans, mortality, and legacy being much more prominent than in usual scientific discourse.

Ultimately the P5 recommendation walked a fine line through these issues. Its most definitive decision was to recommend against a Higgs factory being hosted in the US, a significant blow to C3 advocates. The panel did recommend US support for any international Higgs factory which comes to fruition, at a level ‘commensurate’ with US support for the LHC. What exactly ‘commensurate’ means in this context will surely be debated in the coming years.

However, the big story to many was the panel’s endorsement of the muon collider’s vision. While recognizing the scientific hurdles that would need to be overcome, they called the possibility of a muon collider hosted in the US a scientific ‘muon shot‘ that would reap huge gains. They therefore recommended funding for R&D targeting the key technological hurdles that need to be addressed.

Because the situation is unclear on both the muon front and international Higgs factory plans, they recommended a follow up panel to convene later this decade when key aspects have clarified. While nothing was decided, many in the muon collider community took the report as a huge positive sign. While just a few years ago many dismissed talk of such a collider as fantastical, now a real path towards its construction has been laid down.

Hitoshi Murayama, chair of the P5 committee, cuts into a ‘Shoot for the Muon’ cake next to a smiling Lia Merminga, the director of Fermilab. Source

While the P5 report is only one step along the path to a future collider, it was an important one. Eyes will now turn towards reports from the different collider advocates. CERN’s FCC ‘feasibility study’, updates around the CEPC, and the International Muon Collider Collaboration’s detailed design report are all expected in the next few years. These reports will set up the showdown later this decade when concrete funding decisions will be made.

For those interested, the full report as well as executive summaries of its different areas can be found on the P5 website. Members of the US particle physics community are also encouraged to sign the petition endorsing the recommendations here.

Moriond 2023 Recap

Every year since 1966, particle physicists have gathered in the Alps to unveil and discuss their most important results of the year (and to ski). This year I had the privilege to attend the Moriond QCD session so I thought I would post a recap here. It was a packed agenda spanning 6 days of talks, and featured a lot of great results over many different areas of particle physics, so I’ll have to stick to the highlights here.

FASER Observes First Collider Neutrinos

Perhaps the most exciting result of Moriond came from the FASER experiment, a small detector recently installed in the LHC tunnel downstream from the ATLAS collision point. They announced the first ever observation of neutrinos produced in a collider. Neutrinos are produced all the time in LHC collisions, but because they very rarely interact, and current experiments were not designed to look for them, no one had ever actually observed them in a detector until now. Based on data collected during collisions from last year, FASER observed 153 candidate neutrino events, with a negligible amount of predicted backgrounds; an unmistakable observation.

Black image showing colorful tracks left by particles produced in a neutrino interaction
A neutrino candidate in the FASER emulsion detector. Source

This first observation opens the door for studying the copious high energy neutrinos produced in colliders, which sit in an energy range currently unprobed by other neutrino experiments. The FASER experiment is still very new, so expect more exciting results from them as they continue to analyze their data. A first search for dark photons was also released which should continue to improve with more luminosity. On the neutrino side, they have yet to release full results based on data from their emulsion detector which will allow them to study electron and tau neutrinos in addition to the muon neutrinos this first result is based on.

New ATLAS and CMS Results

The biggest result from the general purpose LHC experiments was ATLAS and CMS both announcing that they have observed the simultaneous production of 4 top quarks. This is one of the rarest Standard Model processes ever observed, occurring a thousand times less frequently than a Higgs being produced. Now that it has been observed the two experiments will use Run-3 data to study the process in more detail in order to look for signs of new physics.

Event displays from ATLAS and CMS showing the signature of 4 top events in their respective detectors
Candidate 4 top events from ATLAS (left) and CMS (right).

ATLAS also unveiled an updated measurement of the mass of the W boson. Since CDF announced its measurement last year, finding a value in ~7-sigma tension with the Standard Model, further W mass measurements have become very important. This ATLAS result was actually a reanalysis of their previous measurement, with improved PDFs and statistical methods. Though still not as precise as the CDF measurement, these improvements shrunk their errors slightly (from 19 to 16 MeV). The ATLAS measurement reports a value of the W mass in very good agreement with the Standard Model, and in approximately 4-sigma tension with the CDF value. These measurements are very complex, and work is going to be needed to clarify the situation.
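As a rough cross-check, the ~4-sigma figure follows from combining the two uncertainties in quadrature. The central values used here (CDF: 80,433.5 ± 9.4 MeV; ATLAS reanalysis: 80,360 ± 16 MeV) are the published ones, not numbers quoted in this post:

```python
import math

def tension_sigma(m1, err1, m2, err2):
    """Standard deviations separating two independent Gaussian measurements."""
    return abs(m1 - m2) / math.hypot(err1, err2)

cdf = (80433.5, 9.4)     # W mass in MeV (CDF, 2022)
atlas = (80360.0, 16.0)  # W mass in MeV (ATLAS reanalysis)
print(f"{tension_sigma(*cdf, *atlas):.1f} sigma")  # prints "4.0 sigma"
```

Treating the two results as independent Gaussians is itself an approximation; the experiments share some modeling uncertainties, which is part of why careful work is needed to untangle the situation.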

CMS had an intriguing excess (2.8-sigma global) in a search for a Higgs-like particle decaying into an electron and muon. This kind of ‘flavor violating’ decay would be a clear indication of physics beyond the Standard Model. Unfortunately it does not seem like ATLAS has any similar excess in their data.

Status of Flavor Anomalies

At the end of 2022, LHCb announced that the golden channel of the flavor anomalies, the R(K) anomaly, had gone away upon further analysis. Many of the flavor physics talks at Moriond seemed to be dealing with this aftermath.

Of the remaining flavor anomalies, R(D), a ratio comparing the decay rates of B mesons into final states with D mesons and taus versus D mesons plus muons or electrons, has still been attracting interest. LHCb unveiled a new measurement that focused on hadronically decaying taus and found a value that agreed with the Standard Model prediction. However, this new measurement had larger error bars than others, so it only brought down the world average slightly. The deviation currently sits at around 3-sigma.

A summary plot showing all the measurements of R(D) and R(D*). The newest LHCb measurement is shown in the red band / error bar on the left. The world average still shows a 3-sigma deviation to the SM prediction

An interesting theory talk pointed out that essentially any new physics which would produce a deviation in R(D) should also produce a deviation in another lepton flavor ratio, R(Λc), because it features the same b->clv transition. However LHCb’s recent measurement of R(Λc) actually found a small deviation in the opposite direction as R(D). The two results are only incompatible at the ~1.5-sigma level for now, but it’s something to continue to keep an eye on if you are following the flavor anomaly saga.

It was nice to see that the newish Belle II experiment is now producing some very nice physics results, the highlight of which was a world-best measurement of the mass of the tau lepton. Look out for more nice Belle II results as they ramp up their luminosity, and hopefully they can weigh in on the R(D) anomaly soon.

A fit to the invariant mass of the visible decay products of the tau lepton, used to determine its intrinsic mass. An impressive show of precision from Belle II

Theory Pushes for Precision

Much of the theory program focused on advancing the precision of Standard Model predictions. This ‘bread and butter’ physics is sometimes overlooked in the scientific press, but is an absolutely crucial part of the particle physics ecosystem. As experiments reach better and better precision, improved theory calculations are required to accurately model backgrounds, predict signals, and provide precise Standard Model predictions to compare to, so that deviations can be spotted. Nice results in this area included evidence for an intrinsic amount of charm quarks inside the proton from the NNPDF collaboration, very precise extractions of CKM matrix elements using lattice QCD, and two different proposals for dealing with tricky aspects of defining the ‘flavor’ of QCD jets.

Final Thoughts

Those were all the results that stuck out to me. But this is of course a very biased sampling! I am not qualified enough to point out the highlights of the heavy ion sessions or much of the theory presentations. For a more comprehensive overview, I recommend checking out the slides for the excellent experimental and theoretical summary talks. Additionally there was the Moriond Electroweak conference that happened the week before the QCD one, which covers many of the same topics but includes neutrino physics results and dark matter direct detection. Overall it was a very enjoyable conference and really showcased the vibrancy of the field!

The Search for Simplicity : The Higgs Boson’s Self Coupling

When students first learn quantum field theory, the mathematical language that underpins the behavior of elementary particles, they start with the simplest possible interaction you can write down: a particle with no spin and no charge scattering off another copy of itself. One then eventually moves on to the more complicated interactions that describe the behavior of the fundamental particles of the Standard Model. They may quickly forget this simplified interaction as an unrealistic toy example, greatly simplified compared to the complexity of the real world. Though most interactions that underpin particle physics are indeed quite a bit more complicated, nature does hold a special place for simplicity. This barebones interaction is predicted to occur in exactly one scenario: a Higgs boson scattering off itself. And one of the next big targets for particle physics is to try and observe it.

A Feynman diagram consisting of two dotted lines merging together to form a single line.
A Feynman diagram of the simplest possible interaction in quantum field theory, a spin-zero particle interacting with itself.

The Higgs is the only particle without spin in the Standard Model, and the only one that doesn’t carry any type of charge. So even though particles such as gluons can interact with other gluons, it’s never two identical gluons (the two interacting gluons will always carry different color charges). The Higgs is the only particle that can have this ‘simplest’ form of self-interaction. Prominent theorist Nima Arkani-Hamed has said that the thought of observing this “simplest possible interaction in nature gives [him] goosebumps“.

But more than being interesting for its simplicity, this self-interaction of the Higgs underlies a crucial piece of the Standard Model: the story of how particles got their mass. The Standard Model tells us that the reason all fundamental particles have mass is their interaction with the Higgs field. Every particle’s mass is proportional to the strength of the Higgs field. The fact that particles have any mass at all is tied to the fact that the lowest energy state of the Higgs field is at a non-zero value. According to the Standard Model, early in the universe’s history, when temperatures were much higher, the Higgs potential had a different shape, with its lowest energy state at a field value of zero. At that point all the particles we know of were massless. As the universe cooled, the shape of the Higgs potential morphed into a ‘wine bottle’ shape, and the Higgs field moved into the new minimum at non-zero value, where it sits today. The symmetry of the initial state, in which the Higgs sat at the center of its potential, was ‘spontaneously broken’: the new minimum, at a location away from the center, breaks the rotational symmetry of the potential. Spontaneous symmetry breaking is a very deep theoretical idea that shows up not just in particle physics but in exotic phases of matter as well (e.g. superconductors).

A diagram showing the ‘unbroken’ Higgs potential of the very early universe (left) and the ‘wine bottle’ shape it has today (right). When the Higgs sits at the center of its potential, there is a rotational symmetry: no direction is preferred. But once it finds its new minimum, that symmetry is broken. The Higgs now sits at a particular field value away from the center, and a preferred direction exists in the system.

This fantastical story of how particles gained their masses, one of the crown jewels of the Standard Model, has not yet been confirmed experimentally. So far we have studied the Higgs’s interactions with other particles, and started to confirm the story that it couples to particles in proportion to their mass. But to confirm this story of symmetry breaking we will need to study the shape of the Higgs’s potential, which we can probe only through its self-interactions. Many theories of physics beyond the Standard Model, particularly those that attempt to explain how the universe ended up with so much matter and so little anti-matter, predict modifications to the shape of this potential, further strengthening the importance of this measurement.
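For concreteness, here is the textbook version of this story (standard quantum field theory, not spelled out in the post): the ‘wine bottle’ potential, its symmetry-breaking minimum, and the self-interaction terms that appear when expanding around it.

```latex
% The Higgs potential and its non-zero minimum
V(\phi) = -\mu^2 |\phi|^2 + \lambda |\phi|^4,
\qquad
v = \sqrt{\mu^2/\lambda} \approx 246~\mathrm{GeV}.

% Expanding around the minimum, \phi = (v + h)/\sqrt{2}:
V(h) = \frac{1}{2} m_h^2 h^2
     + \frac{m_h^2}{2v}\, h^3
     + \frac{\lambda}{4}\, h^4,
\qquad
m_h^2 = 2\lambda v^2.
```

The $h^3$ term is the self-coupling targeted by di-Higgs searches. With $m_h \approx 125$ GeV, the Standard Model fixes $\lambda = m_h^2/(2v^2) \approx 0.13$, so a measured deviation in the $h^3$ coupling would directly signal a different potential shape.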

Unfortunately, observing the Higgs interacting with itself, and thus measuring the shape of its potential, will be no easy feat. The key way to observe the Higgs’s self-interaction is to look for a single Higgs boson splitting into two. However, in the Standard Model, other processes that produce two Higgs bosons quantum mechanically interfere with the self-interaction process, leading to a reduced production rate. It is expected that a Higgs boson scattering off itself occurs around 1000 times less often than the already rare processes which produce a single Higgs boson. A few years ago it was projected that by the end of the LHC’s run (with 20 times more data collected than is available today), we may just barely be able to observe the Higgs’s self-interaction by combining data from both of the major experiments at the LHC (ATLAS and CMS).

Fortunately, thanks to sophisticated new data analysis techniques, LHC experimentalists are currently significantly outpacing the projected sensitivity. In particular, powerful new machine learning methods have allowed physicists to cut away background events mimicking the di-Higgs signal much more than was previously thought possible. Because each of the two Higgs bosons can decay in a variety of ways, the best sensitivity will be obtained by combining multiple different ‘channels’ targeting different decay modes. It is therefore going to take a village of experimentalists each working hard to improve the sensitivity in various different channels to produce the final measurement. However with the current data set, the sensitivity is still a factor of a few away from the Standard Model prediction. Any signs of this process are only expected to come after the LHC gets an upgrade to its collision rate a few years from now.

Limit plots on HH production in various different decay modes.
Current experimental limits on the simultaneous production of two Higgs bosons, a process sensitive to the Higgs’s self-interaction, from ATLAS (left) and CMS (right). The predicted rate from the Standard Model is shown in red in each plot while the current sensitivity is shown with the black lines. This process is searched for in a variety of different decay modes of the Higgs (various rows on each plot). The combined sensitivity across all decay modes for each experiment allows them currently to rule out the production of two Higgs bosons at 3-4 times the rate predicted by the Standard Model. With more data collected both experiments will gain sensitivity to the range predicted by the Standard Model.

While experimentalists will work as hard as they can to study this process at the LHC, to perform a precision measurement of it, and really confirm the ‘wine bottle’ shape of the potential, it’s likely a new collider will be needed. Studying this process in detail is one of the main motivations to build a new high-energy collider, with the current leading candidates being an even bigger proton-proton collider to succeed the LHC or a new type of high-energy muon collider.

Various pictorial representations of the uncertainty on the Higgs potential shape.
A depiction of our current uncertainty on the shape of the Higgs potential (center), our expected uncertainty at the end of the LHC (top right) and the projected uncertainty a new muon collider could achieve (bottom right). The Standard Model expectation is the tan line and the brown band shows the experimental uncertainty. Adapted from Nathaniel Craig’s talk here

The quest to study nature’s simplest interaction will likely span several decades. But this long journey gives particle physicists a roadmap for the future, and a treasure worth traveling great lengths for.

Read More:

CERN Courier Interview with Nima Arkani-Hamed on the future of Particle Physics on the importance of the Higgs’s self-coupling

Wikipedia Article and Lecture Notes on Spontaneous symmetry breaking

Recent ATLAS Measurements of the Higgs Self Coupling

Stretching the limits of dark matter searches with springy detectors

Title: “The Piezoaxionic Effect”

Authors: Asimina Arvanitaki, Amalia Madden, Ken Van Tilburg

Link: https://arxiv.org/abs/2112.11466

We can’t find the missing five-sixths of the universe called dark matter because it doesn’t collide in detectors — but what if it shakes them? Today’s paper theorizes a new kind of stretchy detector that rapidly shrinks and expands in dark matter’s presence, creating a tiny vibration that can be measured.

Old hypothesis, new effect

Many physicists think dark matter might be made up of axions. These hypothetical particles would simultaneously explain dark matter and solve the “strong CP problem”, another gap in our understanding of particle physics.

Axions are so light that they behave more like a wave than a particle, so most attempts to find them rely on some sort of oscillatory signal they would cause in a detector. Under the right conditions, the omnipresent axion field can cause neutrons to gyrate or create electromagnetic waves, so physicists build experiments that resonate at just the right frequency to pick out these axion-induced oscillations.

Detectors called haloscopes pick up electromagnetic waves of a particular frequency. Looking for axions with a haloscope is like tuning an FM radio and trying to find a particular, very faint song, but without knowing which station it’s on. If physicists can pick out the song from the loud sea of static, the frequency they find it at will tell them the axion’s mass.
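The ‘station’ on this dial maps directly onto the axion mass: the axion field oscillates at a frequency ν = m_a c²/h. As a quick illustration (a standard unit conversion, not a number taken from any particular experiment):

```python
# Frequency of the oscillating axion field for a given axion mass:
# nu = m_a * c^2 / h  (with the mass already expressed in eV).
H_EV_S = 4.135667696e-15  # Planck constant in eV*s

def axion_frequency_hz(mass_ev):
    """Oscillation frequency (Hz) of the axion field for a mass in eV."""
    return mass_ev / H_EV_S

# A 1 micro-eV axion oscillates at ~242 MHz, squarely in radio territory.
print(f"{axion_frequency_hz(1e-6) / 1e6:.0f} MHz")  # prints "242 MHz"
```

This is why scanning for the axion really is like tuning a radio: each trial mass corresponds to one frequency the resonator must be tuned to.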

Fig. 1: Animation of the piezoelectric effect. Strains in the object create a voltage difference between its two sides. Inversely, applying a voltage across the sides causes the material to stretch or shrink. © User:Tizeff / Wikimedia Commons / CC-BY-SA-3.0

If the axion is too light to resonate a haloscope, a different type of resonator will be needed to find it. Today’s paper finds that as axions pass through certain special materials, they exert a minuscule oscillatory tug on the atoms in the material. The authors coined the term “piezoaxionic effect” for this phenomenon, an analogy to the piezoelectric effect in which EM waves pull and stretch out certain crystals, as shown in Fig. 1. In the same way, they write, axion waves should cause crystals to repeatedly stretch and shrink, like a slinky suspended from one end. Again, the frequency of these oscillatory changes in length depends on the axion’s mass. In most cases, they are too small to notice, but if the axion matches the crystal’s resonant frequency, the vibrations might get amplified enough to detect.
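To get a feel for why acoustic crystal resonators reach lighter axions than haloscopes, one can sketch the scaling (the sound speed and crystal size below are illustrative assumptions, not numbers from the paper): a crystal’s fundamental acoustic mode sits near f = v_s/2L, and the detector is most sensitive to the axion mass whose oscillation frequency matches that resonance.

```python
# Rough scaling between crystal size and the axion mass it resonates
# with.  Assumed numbers: sound speed ~5000 m/s, centimeter-scale crystal.
H_EV_S = 4.135667696e-15  # Planck constant in eV*s

def resonant_frequency_hz(length_m, sound_speed_m_s=5000.0):
    """Fundamental longitudinal resonance of a crystal of given length."""
    return sound_speed_m_s / (2.0 * length_m)

def matched_axion_mass_ev(length_m, sound_speed_m_s=5000.0):
    """Axion mass whose oscillation frequency matches that resonance."""
    return H_EV_S * resonant_frequency_hz(length_m, sound_speed_m_s)

# A 1 cm crystal resonates near 250 kHz, matching an axion mass of
# order 1e-9 eV -- far below the micro-eV range typical haloscopes cover.
print(f"{matched_axion_mass_ev(0.01):.2e} eV")
```

Acoustic frequencies (kHz to MHz) thus translate to axion masses orders of magnitude lighter than the GHz-scale electromagnetic resonances of haloscopes.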

Stretchy detectors

Fig. 2: The strength of axion signal to which various experimental probes are sensitive, as a function of the axion’s mass. We expect axions to appear somewhere in the green band, so experiments aim for sensitivities that dip below it. The blue and red bands represent one larger and one smaller set of the proposed stretchy detectors, respectively. The broader gold band highlights what might be achievable if scientists figure out how to vary the detectors’ resonant frequency to match a given axion mass.

The paper proposes a detector made of these crystals that tries to measure their vibrations. Since every piezoaxionic crystal is also piezoelectric, its stretching and shrinking will create an oscillating electrical voltage between its edges. The authors calculated the size of this voltage signal, and compared it with the noise levels they expect if they use fancy quantum sensors for the measurement. This gave them an estimate of the crystals’ sensitivity to axions.

But a given detector is only really sensitive if the axion mass is near its resonant frequency, so it would take lots of detectors to test for a range of masses. Fig. 2 shows the sensitivity of a proposed experiment, which would operate for ten years using two sets of detectors, one with arrays of millimeter-scale crystals (red), and one with arrays of centimeter-scale crystals (blue). The green area shows where we expect axions to live within this two-dimensional space. To discover axions, you need an experiment whose sensitivity dips fully below the green band at their actual mass (which, remember, we don’t know). The gold curve is meant to highlight the technology’s future potential if physicists can figure out a way to tune the resonant frequency of a crystal detector, like tuning a haloscope.

Are axions having a moment?

Axions have been receiving heightened attention recently, since giant detectors buried underground have failed to prove the other leading dark matter theory, that of the weakly interacting massive particle. As that long-favored hypothesis becomes tenuous, many physicists are looking in new directions, and the axion is the readiest alternative.

The vibrational detectors would also probe a different range of axion masses than existing experimental efforts like haloscopes. Since the axion mass is such a huge unknown, a number of different technologies will be required to cover the full range of possibilities. Especially if the tuning of a detector’s resonant frequency becomes possible, this might become a critical new tool for dark matter hunters trying to excavate this parameter space. Perhaps in the coming years, it will be the buzz of one of these vibrating detectors that finally alerts us to dark matter’s true nature.

Read More

“Axion dark matter: What is it and why now?” – Review of axions in Science Advances

“New Results from HAYSTAC’s Phase II Operation with a Squeezed State Receiver” – Recent preprint from a leading haloscope experiment

“Experimental Searches for the Axion and Axion-Like Particles” – Review of experiments trying to discover axions

A world with no weak forces

Gravity, electromagnetism, strong, and weak — these are the beating hearts of the universe, the four fundamental forces. But do we really need the last one for us to exist?

Harnik, Kribs and Perez went about building a world without weak interactions and showed that, indeed, life as we know it could emerge there. This was a counter-proof by example to a famous anthropic argument by Agrawal, Barr, Donoghue and Seckel for the puzzling tininess of the weak scale, i.e. the electroweak hierarchy problem.

Summary of the argument in hep-ph/9707380 that a tiny Higgs mass (in Planck mass units) is necessary for life to develop.

Let’s ask first: would the Sun be there in a weakless universe? Sunshine is the product of proton fusion, and that’s the strong force. However, the reaction chain is ignited by the weak force!

image: Eric G. Blackman

So would no stars shine in a weakless world? Amazingly, there’s another route to trigger stellar burning: deuteron-proton fusion via the strong force! In our world, gas clouds collapsing into stars do not take this option because deuterons are very rare, with protons outnumbering them by 50,000 to one. But we need not carry this, er, weakness into our gedanken universe. We can tune the baryon-to-photon ratio — whose origin is unknown — so that we end up with roughly as many deuterons as protons from the primordial synthesis of nuclei. Harnik et al. go on to show that, as in our universe, elements up to iron can be cooked in weakless stars, that these stars live for billions of years, and that they may explode in supernovae that disperse heavy elements into the interstellar medium.

source: hep-ph/0604027

A “weakless” universe is arranged by elevating the electroweak scale or the Higgs vacuum expectation value (\approx 246 GeV) to, say, the Planck scale (\approx 10^{19} GeV). To get the desired nucleosynthesis, care must be taken to keep the u, d, s quarks and the electron at their usual mass by tuning the Yukawa couplings, which are technically natural.
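In equations: each fermion mass comes from m_f = y_f v/√2, so holding the u, d, s and electron masses fixed while pushing v up to the Planck scale means scaling the corresponding Yukawa couplings down by the same ratio (a sketch of the tuning, not a calculation from the paper):

```latex
m_f = \frac{y_f\, v}{\sqrt{2}}
\quad\Longrightarrow\quad
y_f \;\to\; y_f \times \frac{v}{v'}
\approx y_f \times \frac{246\ \mathrm{GeV}}{10^{19}\ \mathrm{GeV}}
\sim 10^{-17}\, y_f
```

Because Yukawa couplings renormalize multiplicatively, such tiny values are stable against quantum corrections — the sense in which they are “technically natural.”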

And let’s not forget dark matter. To make stars, one needs galaxy-like structures. And to make those, density perturbations must be gravitationally condensed by a large population of matter. In the weakless world of Harnik et al., hyperons make up some of the dark matter, but you would also need plenty of other dark stuff, such as your favourite non-WIMP.

If you believe in the string landscape, a weakless world isn’t just a hypothetical. Someone somewhere might be speculating about a habitable universe with a fourth fundamental force, explaining to their bemused colleagues: “It’s kinda like the strong force, only weak…”

xkcd.com/1489


Bibliography

Viable range of the mass scale of the standard model
V. Agrawal, S. M. Barr, J. F. Donoghue, D. Seckel, Phys.Rev.D 57 (1998) 5480-5492.

A Universe without weak interactions
R. Harnik, G. D. Kribs, G. Perez, Phys.Rev.D 74 (2006) 035006

Further reading

Gedanken Worlds without Higgs: QCD-Induced Electroweak Symmetry Breaking
C. Quigg, R. Shrock, Phys.Rev.D 79 (2009) 096002

The Multiverse and Particle Physics
J. F. Donoghue, Ann.Rev.Nucl.Part.Sci. 66 (2016)

The eighteen arbitrary parameters of the standard model in your everyday life
R. N. Cahn, Rev. Mod. Phys. 68, 951 (1996)

LHCb’s Xmas Letdown : The R(K) Anomaly Fades Away

Just before the 2022 holiday season, LHCb announced it was giving the particle physics community a highly anticipated holiday present: an updated measurement of the lepton flavor universality ratio R(K). Unfortunately, when the wrapping paper was removed and the measurement revealed, the entire particle physics community let out a collective groan. It was not the shiny new-physics-toy we had all hoped for, but another pair of standard-model-socks.

The particle physics community is by now very used to standard-model-socks, receiving hundreds of pairs each year from various experiments all over the world. But this time there had been reason to hope for more. Previous measurements of R(K) from LHCb had shown evidence of a violation of one of the standard model’s predictions (lepton flavor universality), making this triumph of the standard model sting much worse than most.

R(K) is the ratio of how often a B-meson (a bound state containing a b-quark) decays into final states with a kaon (a bound state containing an s-quark) plus two muons versus final states with a kaon plus two electrons. The standard model obeys a (somewhat mysterious) principle called lepton flavor universality, which says that muons are just heavier versions of electrons. This principle implies that B-meson decays should produce electrons and muons equally often, so R(K) should be one.
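As a toy illustration (invented numbers and a made-up helper, not LHCb’s actual analysis), R(K) is assembled from efficiency-corrected event counts in the two channels:

```python
# Toy illustration of how R(K) is built from event counts. A real analysis
# fits signal and background shapes and corrects for many detector effects;
# the numbers and helper below are invented for illustration only.

def r_k(n_mumu: float, eff_mumu: float, n_ee: float, eff_ee: float) -> float:
    """Efficiency-corrected ratio of B -> K mu+ mu- to B -> K e+ e- yields."""
    return (n_mumu / eff_mumu) / (n_ee / eff_ee)

# With lepton flavor universality, the corrected yields match and R(K) = 1.
print(r_k(n_mumu=1000, eff_mumu=0.10, n_ee=500, eff_ee=0.05))  # -> 1.0

# An unmodeled background leaking into the electron count inflates it,
# dragging the measured R(K) below one.
print(r_k(n_mumu=1000, eff_mumu=0.10, n_ee=600, eff_ee=0.05))
```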

But previous measurements from LHCb had found R(K) to be less than one, with around 3σ of statistical evidence. Other LHCb measurements of B-meson decays had also been showing similar hints of lepton flavor universality violation. This consistent pattern of deviations had not yet reached the significance required to claim a discovery. But it had led a good number of physicists to become #cautiouslyexcited that there may be a new particle around, possibly interacting preferentially with muons and b-quarks, that was causing the deviation. Several hundred papers were written outlining possibilities for what particles could cause these deviations, checking whether their existence was constrained by other measurements, and suggesting additional measurements and experiments that could rule out or discover the various possibilities.

This had all led to a considerable amount of anticipation for these updated results from LHCb, slated to be their final word on the anomaly using the full dataset collected during the LHC’s second running period (2015-2018). Unfortunately, what LHCb discovered in this latest analysis was that they had made a mistake in their previous measurements.

There were additional backgrounds in their electron signal region which had not previously been accounted for. These backgrounds came from decays of B-mesons into pions or kaons that can be mistakenly identified as electrons. Backgrounds from mis-identification are always difficult to model with simulation, and because they also come from decays of B-mesons, they produce peaks in the data similar to the sought-after signal. Both factors combined to make them hard to spot. Left unaccounted for, these backgrounds made it seem as though more electron signal was being produced than expected, pushing R(K) below one. In this latest measurement, LHCb found a way to estimate these backgrounds using other parts of their data. Once they were accounted for, the measurements of R(K) no longer showed any deviations: all agreed with one within uncertainties.

Plots showing two of the signal regions for the electron channel measurements. The previously unaccounted-for backgrounds are shown in lime green and the measured signal contribution in red. These backgrounds peak where the signal does, making it hard to spot that they were missing.

It is important to mention here that data analysis in particle physics is hard. As we attempt to test the limits of the standard model we are often stretching the limits of our experimental capabilities and mistakes do happen. It is commendable that the LHCb collaboration was able to find this issue and correct the record for the rest of the community. Still, some may be a tad frustrated that the checks which were used to find these missing backgrounds were not done earlier given the high profile nature of these measurements (their previous result claimed ‘evidence’ of new physics and was published in Nature).

Though the R(K) anomaly has faded away, the related set of anomalies once thought to be part of a coherent picture (including another leptonic branching ratio, R(D), and an angular analysis of the same B-meson decay into muons) still remain for now. Most of these additional anomalies, however, involve significantly larger uncertainties on the Standard Model predictions than R(K) did, making them less ‘clean’ indications of new physics.

Besides these ‘flavor anomalies’, other hints of new physics remain, including measurements of the muon’s magnetic moment, the measured mass of the W boson, and others. Certainly none of these is a slam dunk, as each carries its own causes for skepticism.

So as we begin 2023, with a great deal of fresh LHC data expected to be delivered, particle physicists once again take up our seemingly Sisyphean task: to find evidence of physics beyond the standard model. We know it’s out there, but nature is under no obligation to make it easy for us.

Paper: Test of lepton universality in b→sℓ+ℓ− decays (arXiv link)

Authors: LHCb Collaboration

Read More:

Excellent twitter thread summarizing the history of the R(K) saga

A related, still discrepant, flavor anomaly from LHCb

The W Mass Anomaly

What’s Next for Theoretical Particle Physics?

2022 saw the pandemic-delayed Snowmass process confront the past, present, and future of particle physics. As the last papers trickle in for the year, we review Snowmass’s major developments and takeaways for particle theory.

A team of scientists wanders through the landscape of questions. Generated by DALL·E 2.

It’s February 2022, and I am in an auditorium next to the beach in sunny Santa Barbara, listening to particle theory experts discuss their specialty. Each talk begins with roughly the same starting point: the Standard Model (SM) is incomplete. We know it is incomplete because, while its predictive capability is astonishingly impressive, it does not address a multitude of puzzles. These are the questions most familiar to any reader of popular physics: What is dark matter? What is dark energy? How can gravity be incorporated into the SM, which describes only 3 of the 4 known fundamental forces? How can we understand the origin of the SM’s structure — the values of its parameters, the hierarchy of its scales, and its “unnatural” constants that are calculated to be mysteriously small or far too large to be compatible with observation? 

This compilation of questions is the reason that I, and all others in the room, are here. In the 80s, the business of particle discovery was booming. Eight new particles had been confirmed in the past two decades alone, cosmology was pivoting toward the recently developed inflationary paradigm, and supersymmetry (SUSY) was — as the lore goes — just around the corner. This flourish of progress and activity in the field had become too extensive for any collaboration or laboratory to address on its own. Meanwhile, links between theoretical developments, experimental proposals, and the flurry of results ballooned. The transition from the solitary 18th century tinkerer to the CERN summer student running an esoteric simulation for a research group was now complete: particle physics, as a field and community, had emerged. 

It was only natural that the field sought a collective vision, or at minimum a notion of promising directions to pursue. In 1982, the American Physical Society’s Division of Particles and Fields organized Snowmass, a conference of a mere hundred participants that took place in a single room on a mountain in its namesake town of Snowmass, Colorado. Now, too large to be contained by its original location (although enthusiasm for organizing physics meetings at prominent ski locations abounds), Snowmass is both a conference and a multi-year process. 

The depth and breadth of particle physics knowledge acquired in the last half-century is remarkable, yet a snapshot of the field today appears starkly different. The Higgs boson just celebrated its tenth “discovery birthday”, and while the completion of the Standard Model (SM) as we know it is no doubt a momentous achievement, no new fundamental particles have been found since, despite overwhelming evidence of the inevitability of new physics. Supersymmetry may still prove to be just around the corner at a next-generation collider…or orders of magnitude beyond our current experimental reach. Despite attention-grabbing headlines that herald the “death” of particle physics, there remains an abundance of questions ripe for exploration. 

In light of this shift, the field is up against a challenge: how do we reconcile the disappointments of supersymmetry? Moreover, how can we make the case for the importance of fundamental physics research in an increasingly uncertain landscape?

The researchers are here at the Kavli Institute for Theoretical Physics (KITP) at UC Santa Barbara to map out the “Theory Frontier” of the Snowmass process. The “frontiers” — subsections of the field, each focusing on a different approach to particle physics — have met over the past year to weave their own stories of the last decade’s progress, cataloguing both open questions and promising future trajectories. This past summer, thousands of particle physicists across the frontiers convened in Seattle, Washington to share, debate, and ponder questions and new directions. Now, these frontiers are collating their stories into an anthology.

Below are a few (theory) focal points in this astoundingly expansive picture.

Scattering Amplitudes

A 4-point amplitude can be constructed from two 3-point amplitudes. By Henriette Elvang.

Quantum field theory (QFT) is the common language of particle physics. QFT describes a particle system using two fundamental tools, the Lagrangian and the path integral, which can both be wrapped up in the familiar diagrams of Richard Feynman. This approach, which visualizes incoming and outgoing scattering or decaying particles, has provided relief to many Ph.D. students over the past few generations due to its comparative ease of use. The diagrams are roughly divided into three parts: propagators (which describe the motion of a free particle), vertices (at which three or more particles interact), and loops (in which the trajectories of intermediate particles form a closed path). They contain both real external particles, known as on-shell, and virtual, intermediate particles that cannot be measured, known as off-shell. Calculating a scattering amplitude — the probability of one or more particles interacting to form some specified final state — in this paradigm requires summing over all possibilities for what these virtual particles may be. This can prove not only cumbersome, but can also introduce redundancies into our calculations.

Particle theory, however, is undergoing a paradigm shift. If we instead focus on the physical observable itself, the scattering amplitude, we can build more complicated diagrams from simpler ones in a recursive fashion. For example, we can imagine creating a 4-particle amplitude by gluing together two 3-particle amplitudes, as shown above. The process bypasses the intermediate, virtual particles and focuses only on computing on-shell states. This is not only a nice feature, but it can significantly reduce the problem at hand: calculating the scattering amplitude of 8 gluons with the Feynman approach  requires computing more than a million Feynman diagrams, whereas the amplitudes method reduces the problem to a mere half-line. 

In recent years, this program has seen renewed efforts, not only for its practical simplicity but also for its insights into the underlying concepts that shape particle interactions. The Lagrangian formalism organizes a theory based on the fields it contains and the symmetries those fields obey, with the rule of thumb that any term respecting the theory’s symmetries can be included in the Lagrangian. Further, these terms satisfy several general principles: unitarity (the probabilities of all possible processes in the theory sum to one, and remain so under time evolution), causality (an effect originates only from a cause contained in the effect’s backward light cone), and locality (observables localized at distinct regions in spacetime cannot affect one another). These are all reasonable axioms, but they must be explicitly baked into a theory represented in the Lagrangian formalism. Scattering amplitudes, in contrast, can reveal these principles without prior assumptions, signaling the unveiling of a more fundamental structure.

Recent research surrounding amplitudes concerns both diving deeper into this structure, as well as applying the results of the amplitudes program toward precision theory predictions. The past decade has seen a flurry of results from an idea known as bootstrapping, which takes the relationship between physics and mathematics and flips it on its head.

QFTs are typically built up from “bottom-up” by including terms in the Lagrangian based on which fields are present and which symmetries they obey. The bootstrapping methodology instead asks what the observable quantities are that result from a theory, and considers which underlying properties they must obey in order to be mathematically consistent. This process of elimination rules out a large swath of possibilities, significantly constraining the system and allowing us to, in some cases, guess our way to the answer. 

This rich research program has plenty of directions to pursue. We can compute the scattering amplitudes of multi-loop diagrams in order to arrive at extremely precise SM predictions. We can probe their structure in the classical regime with the gravitational waves resulting from inspiraling stellar and black hole binaries. We can apply them to less understood regimes; for example, cosmological scattering amplitudes pose a unique challenge because they proceed in curved, rather than flat, space. Are there curved space analogues to the flat space amplitude structures? If so, what are they? What can we compute with them? Amplitudes are pushing forward our notion of what a QFT is. With them, we may be able to uncover the more fundamental frameworks that must underlie particle physics.

Computational Advances

The overlap between machine learning, deep learning, artificial intelligence, and physics. By Jesse Thaler.

Making theoretical predictions in the modern era has become incredibly computationally expensive. The Large Hadron Collider (LHC) and other accelerators produce over 100 terabytes of data per day while running, requiring not only intensive data filtering systems, but efficient computational methods to categorize and search for the signatures of particle collisions. Performing calculations in the quark sector — which relies on lattice gauge theory, in which spacetime is broken down into a discrete grid — also requires vast computational resources. And as simulations in astrophysics and cosmology balloon, so too does the supercomputing power needed to handle them. 

This challenge over the past decade has received a significant helping hand from the advent of machine learning — deep learning in particular. On the collider front, these techniques have been applied to the detection of anomalies — deviations in the data from the SM “background” that may signal new physics — as well as to the analysis of jets. These protocols can be trained on previously analyzed collider data and on synthetic data to establish benchmarks and push computational efficiency much further. As the LHC enters its third operational run, it will be particularly focused on precision measurements, as the increasing quantity of data allows for higher statistical certainty in our results. The growing list of anomalies — including the W mass measurement and the muon g-2 anomaly — will confront these increased statistics, allowing for possible confirmation or rejection of previous results. Our analyses have also grown more sophisticated; the collimated showers of quarks and gluons produced in hadron collisions, known as jets, have proved to reveal substructure that opens up another avenue for comparing data with an SM theory prediction.
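A deliberately minimal caricature of the anomaly-detection idea (our own toy, far from any experiment’s pipeline): model the “background” from a large SM-like sample, then flag events that land improbably far out in its tails. Real analyses replace the one-dimensional Gaussian below with deep networks over high-dimensional event features.

```python
# Minimal sketch of background-based anomaly detection. We assume only a
# large "background" sample and one event feature; everything here is a
# toy stand-in for the deep-learning protocols described in the text.
import numpy as np

rng = np.random.default_rng(0)

def fit_background(events: np.ndarray):
    """Model the SM background as a Gaussian over one event feature."""
    return events.mean(), events.std()

def anomaly_score(events: np.ndarray, mean: float, std: float) -> np.ndarray:
    """Score events by how many standard deviations they sit from background."""
    return np.abs(events - mean) / std

background = rng.normal(loc=0.0, scale=1.0, size=100_000)  # SM-like sample
mean, std = fit_background(background)

candidates = np.array([0.1, 5.2, -0.3])   # one event deep in the tail
flagged = anomaly_score(candidates, mean, std) > 4.0
print(flagged)  # only the far-out event is flagged
```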

The quark sector especially will benefit from the growing adoption of machine learning in particle physics. Analytical calculations in this sector are intractable due to strong coupling, so in practice calculations are built upon the use of lattice gauge theory. Increasingly precise calculations are dependent upon making this grid smaller and smaller and including more and more particles. 

As physics continually benefits from the rapid development of machine learning and artificial intelligence, the field is up against a unique challenge. Machine learning algorithms can often be applied blindly, resulting in misunderstood outputs via a black box. The key in utilizing these techniques effectively is in asking the right questions, understanding what questions we are asking, and translating the physics appropriately to a machine learning context. This has its practical uses — in confidently identifying tasks for which automation is appropriate — but also opens up the possibility to formulate theoretical particle physics in a computational language. As we look toward the future, we can dream of the possibilities that could result from such a language: to what extent can we train a machine to learn physics itself? There is much work to be done before such questions can be answered, but the prospects are exciting nonetheless.

Cosmological Approaches

Shapes of possible non-Gaussian correlations in the distribution of galaxies. By Dan Green.

As the promise of SUSY goes so far unfulfilled, the space of possible models remains expansive, and anomalies pop up and disappear in our experiments, the field is yearning for a source of new data. While colliders have fueled significant progress in the past decades, a new horizon has formed with the launch of ultra-precise telescopes and gravitational wave detectors: probing the universe via cosmological data.

The use of observations of astrophysical and cosmological sources to tell us about particle physics is not new — we’ve long hunted for supernovae and mapped the cosmic microwave background (CMB) — but nascent theory developments hold incredible potential for discovery. Since 2015, observations of gravitational waves have guided insights into stellar and black hole binaries, with an eye toward detecting a stochastic gravitational wave background originating from the period of exponential expansion known as inflation, which proceeded shortly after the big bang. The observation of black hole binaries in particular can provide valuable insights into the workings of gravity at the smallest of scales, where it enters the realm of quantum mechanics. The possibility of a stochastic gravitational wave background raises the promise of “seeing” the universe at earlier stages in its history than we’ve ever been able to access, potentially even the start of the universe itself.

Inflation also lends itself to other applications within particle physics. Quantum fields in the early universe, in accordance with the uncertainty principle, underwent tiny statistical fluctuations. These initial spacetime curvature perturbations beget density perturbations in the distribution of matter, which beget the temperature fluctuations visible in the cosmic microwave background (CMB). As of the latest CMB datasets, these fluctuations are observed to follow a Gaussian normal distribution. But tiny primordial non-Gaussianities — correlated patterns of fluctuations in the CMB and other datasets — are predicted for certain particle interactions during inflation. In particular, if particles interacting with the fields responsible for inflation acquire heavy masses during inflation, they could imprint a distinct, oscillating signature within these datasets. This would show up in our observables, such as the large-scale distribution of galaxies shown above, in the form of triangular (or higher-point polygonal) shapes signaling a non-Gaussian correlation. Currently, our probes of these non-Gaussianities are not precise enough to unveil such signatures, but planned and upcoming experiments may establish this new window into the workings of the early universe.

Finally, a section on the intersections of cosmology and particle physics would not be complete without mention of everyone’s favorite mystery: dark matter. A decade ago, the prime candidate for dark matter was the WIMP — the Weakly Interacting Massive Particle. This model was fairly simple, able to account for the 25% dark matter content of the universe we observe today, and remained in harmony with all other known cosmology. However, we’ve now probed a large swath of possible masses and cross-sections for the WIMP and come up short. The field’s focus has shifted to a different candidate for dark matter, the axion, which simultaneously addresses both the dark matter mystery and a puzzle known as the strong CP problem. While experiments to probe the axion parameter space are being built, theorists are tasked with identifying well-motivated regions of this space — that is, plausible possibilities for the mass and other parameters describing the axion. The prospects include theoretical motivation from calculations in string theory, considerations of the Peccei-Quinn symmetry underlying the notion of an axion, and various possible modes of production, including extreme astrophysical environments such as neutron stars and black holes.

Cosmological data has thus far been an important window not only into the history and evolution of the universe, but also into particle physics at high energy scales. As new telescopes and gravitational wave observatories come online within the next decade, expect this prolific field to continue to deliver alluring prospects for physics beyond the SM.

Neutrinos

A visual representation of how neutrino oscillation works. From: http://www.hyper-k.org/en/neutrino.html.

While the previous sections have highlighted new approaches to uncovering physics beyond the SM, one particular collection of particles stands out in the spotlight. In the SM’s formulation, the three flavors of neutrinos are massless, just like the photon. Yet we know unequivocally from experiment that this is false. Neutrinos display a phenomenon known as neutrino mixing, in which one flavor of neutrino can turn into another as it propagates. This implies that at least two of the three neutrino mass states are in fact massive.
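The textbook two-flavor formula makes the mass dependence explicit: the probability of one flavor appearing as another is P = sin²(2θ) sin²(1.27 Δm² L/E), with the mass-squared splitting Δm² in eV², the baseline L in km, and the energy E in GeV. If Δm² vanishes, so does the oscillation — a minimal numerical check of our own, with illustrative parameter values:

```python
# Two-flavor neutrino oscillation probability in the standard convention:
# P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV]).
# Parameter values below are illustrative, not a fit to data.
import math

def oscillation_probability(theta: float, dm2_ev2: float,
                            L_km: float, E_gev: float) -> float:
    """Appearance probability for a two-flavor neutrino system."""
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2)

# Degenerate masses (dm2 = 0): no oscillation at all, whatever the mixing.
print(oscillation_probability(math.pi / 4, 0.0, 1300, 2.5))  # -> 0.0

# Atmospheric-scale splitting over a ~1300 km baseline at 2.5 GeV:
# the flavor change is large -- this is why long baselines are used.
print(oscillation_probability(math.pi / 4, 2.5e-3, 1300, 2.5))
```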

Investigating why neutrinos have mass — and where that mass comes from — is a central question in particle physics. Neutrinos are especially captivating because any observation of a neutrino mass mechanism is guaranteed to be a window to new physics. Further, neutrinos could be of central importance to several puzzles within the SM, including the MicroBooNE anomaly, the question of why there is more matter than antimatter in the observable universe, and the flavor puzzle, among others. The latter refers to an overall lack of understanding of the origin of flavor in the SM. Why do quarks come in six flavors, organized into three generations each consisting of one “up-type” quark with a +⅔ charge and one “down-type” quark with a -⅓ charge? Why do leptons come in six flavors, with three pairs of one electron-like particle and one neutrino? What is the origin of the hierarchy of masses for both quarks and leptons? Of the SM’s 19 free parameters — which include particle masses, coupling strengths, and others — 14 are associated with flavor.

The unequivocal evidence for neutrino mixing was the crowning prize of the last few decades of neutrino physics research. Modern experiments are charged with detecting more subtle signs of new physics, through measurements of neutrino energy in colliders, ever more precise oscillation data, and the possibility of a heavy neutrino belonging to a fourth generation.

Experiment has a clear role to play; the upcoming Deep Underground Neutrino Experiment (DUNE) will produce neutrinos at Fermilab and observe them at Sanford Lab, South Dakota, in order to accumulate data on long-distance neutrino oscillation. DUNE and other detectors will also turn their eyes toward the sky in observations of neutrinos sourced by supernovae. There is also much room for theorists, both in developing models of neutrino mass generation and in shaping the future of neutrino experiments — short-distance neutrino oscillation experiments are a key proposal in the quest to address the MicroBooNE anomaly.

The field of neutrino physics is only growing. It is likely we’ll learn much more about the SM and beyond through these ghostly, mysterious particles in the coming decades.

Which Collider(s)?

The proposed location, right over the LHC, of the Future Circular Collider (FCC), one of the many options for a next-generation collider. From: CERN.

One looming question has formed an undercurrent through the entirety of the Snowmass process: What’s next after the LHC? In the past decade, propositions have been fleshed out in various stages, with the goal of satisfying some part of the lengthy wish list of questions a future collider would hope to probe. 

The most well-known possible successor to the LHC is the Future Circular Collider (FCC), which is roughly a plan for a larger LHC, able to reach collision energies of around 100 TeV, some seven times that of its modern-day counterpart. An FCC that collides hadrons, as the LHC does, would extend the reach of our studies of the Higgs boson and other force-carrying gauge bosons, and of dark matter searches. Its higher collision rate would enable studies of rare hadron decays and continue the trek into the realm of flavor physics. It would also enable the discovery of gauge bosons of new interactions — if they exist at those energies. This proposal, while captivating, has also met its fair share of skepticism, particularly because there is no singular particle physics goal it would be guaranteed to achieve. When the LHC was built, physicists were nearly certain that the Higgs boson would be found there — and it was. However, physicists were also somewhat confident in the prospect of finding SUSY at the LHC. Could supersymmetric particles be discovered at the FCC? Maybe, or maybe not.

A second plan exists for the FCC, in which it collides electrons and positrons instead of hadrons. This targets the electroweak sector of the SM, covering the properties of the Higgs, the W and Z bosons, and the heaviest quark (the top quark). Whereas hadrons are composite particles, and produce particle showers and jets upon collision, leptons are fundamental particles, and so have well-defined initial states. This allows for greater precision in measurements compared to hadron colliders, particularly in questions of the Higgs. Is the Higgs boson the only Higgs-like particle? Is it a composite particle? How does the origin of mass influence other key questions, such as the nature of dark matter? While unable to reach as high of energies as a hadron collider, an electron-positron collider is appealing due to its precision. This dichotomy epitomizes the choice between these two proposals for the FCC.

The options go beyond circular colliders. Linear colliders such as the International Linear Collider (ILC) and Compact Linear Collider (CLIC) are also on the table. While circular colliders are advantageous for their ability to accelerate particles over long distances and to keep un-collided particles in circulation for other experiments, they come with a particular disadvantage due to their shape. The acceleration of charged particles along a curved path results in synchrotron radiation — electromagnetic radiation that significantly reduces the energy available for each collision. For this reason, a circular accelerator is more suited to the collision of heavy particles — like the protons used in the LHC — than much lighter leptons. The lepton collisions within a linear accelerator would produce Higgs bosons at a high rate, allowing for deeper insight into the multitude of Higgs-related questions.

In the past few years, interest has grown for a different kind of lepton collider: a muon collider. Muons are, like electrons, fundamental particles, and therefore much cleaner in collisions than composite hadrons. They are also much more massive than electrons, which leads to a smaller proportion of energy being lost to synchrotron radiation in comparison to electron-positron colliders. This would allow for both high-precision measurements as well as high energies, making a muon collider an incredibly attractive candidate. The heavier mass of the muon, however, does bring with it a new set of technical challenges, particularly because the muon is not a stable particle and decays within a short timeframe. 
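To get a feel for the synchrotron argument, the energy radiated per turn in a circular machine grows as the fourth power of E/m, so at fixed beam energy and bending radius the loss ratio between two species depends only on their masses. A back-of-the-envelope sketch, not a machine-design calculation:

```python
import math

# Energy radiated per turn in a circular collider scales roughly as
#   dE_per_turn ∝ E^4 / (m^4 * r)
# for an ultra-relativistic particle. At fixed beam energy E and ring
# radius r, the loss ratio between species depends only on their masses.

M_ELECTRON = 0.511   # MeV
M_MUON = 105.66      # MeV
M_PROTON = 938.27    # MeV

def loss_ratio(m_species: float, m_reference: float) -> float:
    """Synchrotron loss of `m_species` relative to `m_reference`,
    at the same beam energy and bending radius."""
    return (m_reference / m_species) ** 4

muon_vs_electron = loss_ratio(M_MUON, M_ELECTRON)
proton_vs_electron = loss_ratio(M_PROTON, M_ELECTRON)
print(f"muon radiates {muon_vs_electron:.1e} of what an electron does")
print(f"proton radiates {proton_vs_electron:.1e} of what an electron does")
```

The muon's ~200-times-larger mass suppresses synchrotron losses by roughly nine orders of magnitude relative to an electron, which is why a circular muon machine can aspire to both high energy and lepton-collision cleanliness.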

As a multi-billion dollar project requiring the cooperation of numerous countries, getting a collider funded, constructed, and running is no easy feat. As collider proposals are put forth and debated, there is much at stake — a future collider will also determine the research programs and careers of many future students and professors. With that in mind, considerable care is necessary. Only one thing is certain: there will be something after the LHC.

Toward the Next Snowmass

The path forward in the quest to understand particle physics. By Raman Sundrum.

The above snapshots are only a few of the myriad subtopics within particle theory; other notable ones include string theory, quantum information science, lattice gauge theory, and effective field theory. The full list of contributed papers can be found here. 

As the Snowmass process wraps up, the voice of particle theory has played and continues to play an influential role. Overall, progress in theory remains more accessible than in experiment —  the number of possible models we’ve developed far outpaces the detectors we are able to build to investigate them. The theoretical physics community both guides the direction and targets of future experiments, and has plenty of room to make progress on the model-building front, including understanding quantum field theories at the deepest level and further uncovering the structures of amplitudes. A decade ago, SUSY at LHC-scales was the prime objective in the hunt for an ultimate theory of physics. Now, new physics could be anywhere and everywhere; Snowmass is crucial to charting our path in an endless valley of questions. I look forward to the trails of the next decade.

The LHC is turning on again! What does that mean?

Deep underground, on the border between Switzerland and France, the Large Hadron Collider (LHC) is starting back up again after a 4 year hiatus. Today, July 5th, the LHC had its first full energy collisions since 2018. Any time the LHC is running is exciting enough on its own, but this new run of data taking will also feature several upgrades to the LHC itself as well as to the several different experiments that make use of its collisions. The physics world will be watching to see if the data from this new run confirms any of the interesting anomalies seen in previous datasets or reveals any other unexpected discoveries.

New and Improved

During the multi-year shutdown the LHC itself has been upgraded. Most notably, the energy of the colliding beams has been increased, from 13 TeV to 13.6 TeV. Besides breaking its own record for the highest energy collisions ever produced, this 5% increase to the LHC’s energy will give a boost to searches looking for very rare high energy phenomena. The rate of collisions the LHC produces is also expected to be roughly 50% higher than the maximum achieved in previous runs. By the end of this three year run, the experiments are expected to have collected twice as much data as in the previous two runs combined.

The experiments have also been busy upgrading their detectors to take full advantage of this new round of collisions.

The ALICE experiment had the most substantial upgrade. It features a new silicon inner tracker, an upgraded time projection chamber, a new forward muon detector, a new triggering system and an improved data processing system. These upgrades will help in its study of the quark-gluon plasma, an exotic phase of matter: a hot, dense soup of nuclear material present in the early universe.


A diagram showing the various upgrades to the ALICE detector (source)

ATLAS and CMS, the two ‘general purpose’ experiments at the LHC, had a few upgrades as well. ATLAS replaced their ‘small wheel’ detector used to measure the momentum of muons. CMS replaced the innermost part of its inner tracker, and installed a new GEM detector to measure muons close to the beamline. Both experiments also upgraded their software and data collection systems (triggers) in order to be more sensitive to the signatures of potential exotic particles that may have been missed in previous runs.

The new ATLAS ‘small wheel’ being lowered into place. (source)

The LHCb experiment, which specializes in studying the properties of the bottom quark, also had major upgrades during the shutdown. LHCb installed a new Vertex Locator closer to the beam line and upgraded their tracking and particle identification system. It also fully revamped its trigger system to run entirely on GPUs. These upgrades should allow them to collect 5 times the amount of data over the next two runs as they did over the first two.

Run 3 will also feature a new smaller scale experiment, FASER, which will study neutrinos produced in the LHC and search for long-lived new particles.

What will we learn?

One of the main goals in particle physics now is finding direct experimental evidence of a phenomenon unexplained by the Standard Model. While very successful in many respects, the Standard Model leaves several mysteries unexplained, such as the nature of dark matter, the imbalance of matter over anti-matter, and the origin of neutrino masses. All of these are questions many hope that the LHC can help answer.

Much of the excitement for Run-3 of the LHC will be on whether the additional data can confirm some of the deviations from the Standard Model which have been seen in previous runs.

One very hot topic in particle physics right now is a series of ‘flavor anomalies’ seen by the LHCb experiment in previous LHC runs. These anomalies are deviations from the Standard Model predictions of how often certain rare decays of b quarks should occur. With their dataset so far, LHCb has not yet had enough data to pass the high statistical threshold required in particle physics to claim a discovery. But if these anomalies are real, Run-3 should provide enough data to claim a discovery.
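For reference, the statistical threshold for a discovery claim is conventionally 5 standard deviations, which corresponds to a very small probability of a pure statistical fluke. The conversion can be sketched in a few lines:

```python
import math

def one_sided_p_value(n_sigma: float) -> float:
    """Probability of a Gaussian fluctuation of at least n_sigma
    in one direction."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# The conventional discovery threshold in particle physics is 5 sigma:
p5 = one_sided_p_value(5.0)
print(f"p-value at 5 sigma: {p5:.2e}")  # roughly 3e-7
```

In other words, roughly a one-in-3.5-million chance of the background alone producing such a signal, which is why physicists demand so much data before claiming a discovery.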

A summary of the various measurements making up the ‘flavor anomalies’. The blue lines and error bars indicate the measurements and their uncertainties. The yellow line and error bars indicates the standard model predictions and their uncertainties. Source

There are also a decent number of ‘excesses’, potential signals of new particles being produced in LHC collisions, that have been seen by the ATLAS and CMS collaborations. The statistical significance of these excesses is still quite low, and many such excesses have gone away with more data. But if one or more of these excesses were confirmed in the Run-3 dataset, it would be a massive discovery.

While all of these anomalies are a gamble, this new dataset will also certainly be used to measure various known quantities with better precision, improving our understanding of nature no matter what. Our understanding of the Higgs boson, the top quark, rare decays of the bottom quark, rare standard model processes, the dynamics of the quark gluon plasma and many other areas will no doubt improve from this additional data.

In addition to these ‘known’ anomalies and measurements, whenever an experiment starts up again there is also the possibility of something entirely unexpected showing up. Perhaps one of the upgrades performed will allow the detection of something entirely new, unseen in previous runs. Perhaps FASER will see signals of long-lived particles missed by the other experiments. Or perhaps the data from the main experiments will be analyzed in a new way, revealing evidence of a new particle which had been missed up until now.

No matter what happens, the world of particle physics is a more exciting place when the LHC is running. So let’s all cheer to that!

Read More:

CERN Run-3 Press Event / Livestream Recording “Join us for the first collisions for physics at 13.6 TeV!”

Symmetry Magazine “What’s new for LHC Run 3?”

CERN Courier “New data strengthens RK flavour anomaly”

A Massive W for CDF

This is part two of our coverage of the CDF W mass measurement, discussing how the measurement was done. Read about the implications of this result in our sister post here.

Last week, the CDF collaboration announced the most precise measurement of the W boson’s mass to date. After nearly ten years of careful analysis, the W weighed in at 80,433.5 ± 9.4 MeV: a whopping seven standard deviations away from the Standard Model expectation! This result quickly became the talk of the town among particle physicists, and there are already dozens of arXiv papers speculating about what it means for the Standard Model. One of the most impressive and hotly debated aspects of this measurement is its high precision, which came from an extremely careful characterization of the CDF detector and recent theoretical developments in modeling proton structure. In this post, I’ll describe how they made the measurement and the clever techniques they used to push down the uncertainties.

The new CDF measurement of the W boson mass. The center of the red ellipse corresponds to the central values of the measured W mass (y-coordinate) and top quark mass (x-coordinate, from other experiments). The purple line shows the Standard Model constraint on the W mass as a function of the top mass, and the border of the red ellipse is the one standard deviation boundary around the measurement.

The imaginatively titled “Collider Detector at Fermilab” (CDF) collected proton-antiproton collision data at Fermilab’s Tevatron accelerator for over 20 years, until the Tevatron shut down in 2011. Much like ATLAS and CMS, CDF is made of cylindrical detector layers, with the innermost charged particle tracker and adjacent electromagnetic calorimeter (ECAL) being most important for the W mass measurement. The Tevatron ran at a center of mass energy of 1.96 TeV — much lower than the LHC’s 13 TeV — which enabled a large reduction in the “theoretical uncertainties” on the measurement. Physicists use models called “parton distribution functions” (PDFs) to calculate how a proton’s momentum is distributed among its constituent quarks, and modern PDFs make very good predictions at the Tevatron’s energy scale. Additionally, at leading order, W boson production in proton-antiproton collisions doesn’t involve any gluons, which are a major source of uncertainty in PDFs (LHC collisions are full of gluons, making for larger theory uncertainty in LHC W mass measurements).

A cutaway view of the CDF detector. The innermost tracking detector (yellow) reconstructs the trajectories of charged particles, and the nearby electromagnetic calorimeter (red) collects energy deposits from photons and charged particles (e.g. electrons). The tracker and EM Cal were both central in the W mass measurement.

Armed with their fancy PDFs, physicists set out to measure the W mass in the same way as always: by looking at its decay products! They focused on the leptonic channel, where the W decays to a lepton (electron or muon) and its associated neutrino. This clean final state is easy to identify in the detector and allows for a high-purity, low-background signal selection. The only sticking point is the neutrino, which flies out of the detector completely undetected. Thankfully, momentum conservation allowed them to reconstruct the neutrino’s transverse momentum (pT) from the rest of the visible particles produced in the collision. Combining this with the lepton’s measured momentum, they reconstructed the “transverse mass” of the W — an important observable for estimating its true mass.
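The transverse mass mentioned above has a simple closed form: in the massless-lepton approximation it depends only on the lepton and neutrino transverse momenta and the azimuthal angle between them. A minimal sketch (not CDF’s actual reconstruction code):

```python
import math

def transverse_mass(pt_lep: float, pt_nu: float, dphi: float) -> float:
    """W transverse mass (GeV) from the lepton pT, the inferred
    neutrino pT, and the azimuthal angle between them, in the
    massless-lepton approximation."""
    return math.sqrt(2.0 * pt_lep * pt_nu * (1.0 - math.cos(dphi)))

# A back-to-back 40 GeV lepton and 40 GeV neutrino sit at the
# kinematic endpoint of the mT distribution, near the W mass.
mt = transverse_mass(40.0, 40.0, math.pi)
print(mt)  # 80.0
```

The endpoint of the mT spectrum is what makes it such a powerful observable: its sharp edge sits near the true W mass and is relatively insensitive to the unknown longitudinal momentum carried off by the neutrino.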

A leptonic decay of the W boson, where it decays to an electron and an electron antineutrino. This channel, along with the muon + muon antineutrino channel, formed the basis of CDF’s W mass measurement.

Many of the key observables for this measurement flow from the lepton’s momentum, which means it needs to be measured very carefully! The analysis team calibrated their energy and momentum measurements by using the decays of other Standard Model particles: the ϒ(1S) and J/ψ mesons, and the Z boson. These particles’ masses are very precisely known from other experiments, and constraints from these measurements helped physicists understand how accurately CDF reconstructs a particle’s energy. For momentum measurements in the tracker, they reconstructed the ϒ(1S) and J/ψ masses from their decays to muon-antimuon pairs inside CDF, and compared CDF-measured masses to their known values from other experiments. This allowed them to calculate a correction factor to apply to track momenta. For ECAL energy measurements, they looked at samples of Z and W bosons decaying to electrons, and measured the ratio of energy deposited in the ECAL (E) to the momentum measured in the tracker (p). The shape of the E/p distribution then allowed them to calculate an energy calibration for the ECAL.
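The essence of the momentum calibration can be sketched in a few lines: a dimuon resonance mass scales linearly with the muon momentum scale, so comparing the reconstructed mass to its world-average value yields a multiplicative correction. The measured value below is hypothetical, purely for illustration (the real CDF calibration fits momentum-dependent corrections and propagates their uncertainties):

```python
# Hypothetical detector-level number for illustration: suppose the
# J/psi mass comes out slightly low in our reconstruction.
M_JPSI_WORLD = 3.0969     # GeV, world-average J/psi mass
m_jpsi_measured = 3.0940  # GeV, illustrative (not a real CDF value)

# The reconstructed dimuon mass scales linearly with the muon momentum
# scale, so the ratio gives a multiplicative correction for tracks.
scale = M_JPSI_WORLD / m_jpsi_measured

p_track_raw = 25.00                       # GeV, some measured track momentum
p_track_corrected = p_track_raw * scale   # calibrated momentum
print(f"scale = {scale:.6f}, corrected p = {p_track_corrected:.4f} GeV")
```

Even a per-mille momentum-scale shift like this one moves the W mass by several MeV, comparable to the total quoted uncertainty, which is why the calibration consumed so much of the analysis effort.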

Left: the fractional deviation of the measured muon momentum relative to its true momentum (y-axis), as a function of the muon’s average inverse transverse momentum. Data from ϒ(1S), J/ψ, and Z decays are shown, and the fit line (in black) has a slope consistent with zero. This indicates that there is no significant mismodeling of the energy lost by a particle flying through the detector. Right: the distribution of the ratio of energy measured in the ECAL to momentum measured in the tracker. The shape of the peak and tail are used to calibrate the ECAL energy measurements.

To make sure their tracker and ECAL calibrations worked correctly, they applied them in measurements of the Z boson mass in the electron and muon decay channels. Thankfully, their measurements were consistent with the world average in both channels, providing an important cross-check of their calibration strategy.

Having done everything humanly possible to minimize uncertainties and calibrate their measurements, the analysis team was finally ready to measure the W mass. To do this, they simulated W boson events with many different settings for the W mass (an additional mountain of effort went into ensuring that the simulations were as accurate as possible!). At each mass setting, they extracted “template” distributions of the lepton pT, neutrino pT, and W boson transverse mass, and fit each template to the distribution measured in real CDF data. The templates that best fit the measured data correspond to CDF’s measured value of the W mass (plus some additional legwork to calculate uncertainties).
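The template method boils down to: simulate the spectrum under many mass hypotheses, then pick the hypothesis whose template best matches the data. A toy version, with Gaussian peaks standing in for the real simulated spectra and a plain sum-of-squares standing in for the real likelihood fit:

```python
import math

def template(mass_hypothesis, bin_centers, width=10.0):
    """Toy mT-like spectrum: a Gaussian peak at `mass_hypothesis` (GeV).
    Real templates come from detailed detector simulation."""
    return [math.exp(-0.5 * ((x - mass_hypothesis) / width) ** 2)
            for x in bin_centers]

def chi2(data, model):
    """Simple goodness-of-fit between a data and template histogram."""
    return sum((d - m) ** 2 for d, m in zip(data, model))

bins = [60 + i for i in range(41)]   # bin centers from 60 to 100 GeV
data = template(80.4, bins)          # pretend "measured" spectrum

# Scan mass hypotheses; the best-fitting template gives the fitted mass.
scan = [79.0 + 0.1 * i for i in range(29)]   # 79.0 to 81.8 GeV
best = min(scan, key=lambda m: chi2(data, template(m, bins)))
print(f"best-fit mass: {best:.1f} GeV")
```

The real analysis does this with high-statistics simulated templates, a binned likelihood instead of a plain χ², and a careful treatment of systematic uncertainties, but the logic of scanning hypotheses against data is the same.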

The reconstructed W boson transverse mass distribution in the muon + muon antineutrino decay channel. The best-fit template (red) is plotted along with the background distribution (gray) and the measured data (black points).

After years of careful analysis, CDF’s measurement of mW = 80,433.5 ± 9.4 MeV sticks out like a sore thumb. If it stands up to the close scrutiny of the particle physics community, it’s further evidence that something new and mysterious lies beyond the Standard Model. The only way to know for sure is to make additional measurements, but in the meantime we’ll all be happily puzzling over what this might mean.

CDF’s W mass measurement (bottom), shown alongside results from other experiments and the SM expectation (gray).

Read More

Quanta Magazine’s coverage of the measurement

A recorded talk from the Fermilab Wine & Cheese seminar covering the result in great detail

Too Massive? New measurement of the W boson’s mass sparks intrigue

This is part one of our coverage of the CDF W mass result covering its implications. Read about the details of the measurement in a sister post here!

Last week the physics world was abuzz with the latest results from an experiment that stopped running a decade ago. Some were heralding this as the beginning of a breakthrough in fundamental physics, headlines read “Shock result in particle experiment could spark physics revolution” (BBC). So what exactly is all the fuss about?

The result itself is an ultra-precise measurement of the mass of the W boson. The W boson is one of the carriers of the weak force, and this measurement pegged its mass at 80,433 MeV with an uncertainty of 9 MeV. The excitement is coming because this value disagrees with the prediction from our current best theory of particle physics, the Standard Model. In the theoretical structure of the Standard Model the masses of the gauge bosons are all interrelated. In the Standard Model the mass of the W boson can be computed based on the mass of the Z as well as a few other parameters in the theory (like the weak mixing angle). In a first approximation (ie to the lowest order in perturbation theory), the mass of the W boson is equal to the mass of the Z boson times the cosine of the weak mixing angle. Based on other measurements that have been performed including the Z mass, the Higgs mass, the lifetime of muons and others, the Standard Model predicts that the mass of the W boson should be 80,357 MeV (with an uncertainty of 6 MeV). So the two numbers disagree quite strongly, at the level of 7 standard deviations.
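As a quick sanity check of the lowest-order relation and of the quoted 7-standard-deviation tension, one can plug in the numbers. The weak mixing angle below is an assumed effective value; the full SM prediction of 80,357 MeV includes sizable loop corrections on top of the tree-level number:

```python
import math

M_Z = 91187.6            # MeV, world-average Z mass
SIN2_THETA_W = 0.23122   # effective weak mixing angle (assumed value)

# Lowest-order relation: mW = mZ * cos(theta_W). Loop corrections
# raise this by a few hundred MeV to the full SM prediction.
mw_tree = M_Z * math.sqrt(1.0 - SIN2_THETA_W)
print(f"tree-level mW ~ {mw_tree:.0f} MeV")

# Tension between the CDF measurement and the full SM prediction,
# using the numbers quoted in the text.
mw_cdf, err_cdf = 80433.5, 9.4
mw_sm, err_sm = 80357.0, 6.0
sigma = abs(mw_cdf - mw_sm) / math.hypot(err_cdf, err_sm)
print(f"tension ~ {sigma:.1f} standard deviations")
```

The tree-level estimate already lands within about half a percent of the measured value; it is the loop corrections, sensitive to particles like the top quark and Higgs (and potentially new physics), that make the precise prediction so interesting.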

If the measurement and the Standard Model prediction are both correct, this would imply that there is some deficiency in the Standard Model; some new particle interacting with the W boson whose effects haven’t been accounted for. This would be welcome news to particle physicists, as we know that the Standard Model is an incomplete theory but have been lacking direct experimental confirmation of its deficiencies. The size of the discrepancy would also mean that whatever new particle was causing the deviation may also be directly detectable within our current or near future colliders.

If this discrepancy is real, exactly what new particles would this entail? Judging based on the 30+ (and counting) papers released on the subject in the last week, there are a good number of possibilities. Some examples include extra Higgs bosons, extra Z-like bosons, and vector-like fermions. It would take additional measurements and direct searches to pick out exactly what the culprit was. But it would hopefully give experimenters definite targets of particles to look for, which would go a long way in advancing the field.

But before everyone starts proclaiming the Standard Model dead and popping champagne bottles, it’s important to take stock of this new CDF measurement in the larger context. Measurements of the W mass are hard; that’s why it has taken the CDF collaboration over 10 years to publish this result since they stopped taking data. And although this measurement is the most precise one to date, several other W mass measurements have been performed by other experiments.

The Other Measurements

A summary of all the W mass measurements performed to date (black dots) with their uncertainties (blue bars) as compared to the Standard Model prediction (yellow band). One can see that this new CDF result is in tension with previous measurements. (source)

Previous measurements of the W mass have come from experiments at the Large Electron-Positron collider (LEP), another experiment at the Tevatron (D0) and experiments at the LHC (ATLAS and LHCb). Though none of these were as precise as this new CDF result, they had been painting a consistent picture of a value in agreement with the Standard Model prediction. If you take the average of these other measurements, their value differs from the CDF measurement at the level of about 4 standard deviations, which is quite significant. This discrepancy seems large enough that it is unlikely to arise from purely random fluctuation, and likely means that either some uncertainties have been underestimated or something has been overlooked in either the previous measurements or this new one.
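The “average of these other measurements” is an inverse-variance weighted average, the standard way to combine independent measurements with Gaussian uncertainties. A generic sketch (the demo inputs are hypothetical, not the actual LEP/D0/LHC values, and real combinations must also account for correlated uncertainties):

```python
def inverse_variance_average(measurements):
    """Combine (value, uncertainty) pairs assuming independent
    Gaussian errors: each measurement is weighted by 1/sigma^2."""
    weights = [1.0 / s ** 2 for _, s in measurements]
    mean = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5
    return mean, err

# Hypothetical inputs purely for illustration: two measurements
# of some mass in MeV.
mean, err = inverse_variance_average([(80370.0, 19.0), (80376.0, 23.0)])
print(f"combined: {mean:.1f} +/- {err:.1f} MeV")
```

Note that the combined uncertainty is always smaller than the best individual one, which is how several less precise measurements can still mount a meaningful challenge to a single very precise outlier.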

What one would like are additional, independent, high precision measurements that could either confirm the CDF value or the average value of the previous measurements. Unfortunately it is unlikely that such a measurement will come in the near future. The only currently running facility capable of such a measurement is the LHC, but it will be difficult for experiments at the LHC to rival the precision of this CDF one.

W mass measurements are somewhat harder at the LHC than the Tevatron for a few reasons. First of all, the LHC is a proton-proton collider, while the Tevatron was a proton-antiproton collider, and the LHC also operates at a higher collision energy than the Tevatron. Both differences cause W bosons produced at the LHC to have more momentum than those produced at the Tevatron. Modeling of the W boson’s momentum distribution can be a significant uncertainty of its mass measurement, and the extra momentum of W’s at the LHC makes this a larger effect. Additionally, the LHC has a higher collision rate, meaning that each time a W boson is produced there are actually tens of other collisions laid on top (rather than only a few other collisions like at the Tevatron). These extra collisions are called pileup and can make it harder to perform precision measurements like these. In particular for the W mass measurement, the neutrino’s momentum has to be inferred from the momentum imbalance in the event, and this becomes harder when there are many collisions on top of each other. Of course W mass measurements are possible at the LHC, as evidenced by ATLAS and LHCb’s already published results. And we can look forward to improved results from ATLAS and LHCb as well as a first result from CMS. But it may be very difficult for them to reach the precision of this CDF result.

A plot of the transverse mass (one of the variables used in a measurement) of the W from the ATLAS measurement. The red and yellow lines show how little the distribution changes if the W mass changes by 50 MeV, which is around two and a half times the uncertainty of the ATLAS result. These shifts change the distribution by only a few tenths of a percent, illustrating the difficulty involved. (source)

The Future

A future electron positron collider would be able to measure the W mass extremely precisely by using an alternate method. Instead of looking at the W’s decay, the mass could be measured through its production, by scanning the energy of the electron beams very close to the threshold to produce two W bosons. This method should offer precision significantly better than even this CDF result. However any measurement from a possible future electron positron collider won’t come for at least a decade.

In the coming months, expect this new CDF measurement to receive a lot of buzz. Experimentalists will be poring over the details trying to figure out why it is in tension with previous measurements and working hard to produce new measurements from LHC data. Meanwhile theorists will write a bunch of papers detailing the possibilities of what new particles could explain the discrepancy and whether there is a connection to other outstanding anomalies (like the muon g-2). But the big question of whether we are seeing the first real crack in the Standard Model or there is some mistake in one or more of the measurements is unlikely to be answered for a while.

If you want to learn about how the measurement actually works, check out this sister post!

Read More:

CERN Courier “CDF sets W mass against the Standard Model”

Blog post on the CDF result from an (ATLAS) expert on W mass measurements “[Have we] finally found new physics with the latest W boson mass measurement?”

PDG Review “Electroweak Model and Constraints on New Physics”