The LHCb collaboration has measured a difference in mass between two particles of 0.00000000000000000000000000000000000001 grams – or, in scientific notation, 10⁻³⁸ g. The result, reported in a paper just submitted for publication in the journal Physical Review Letters and presented today at a CERN seminar, marks a milestone in the study of how a particle known as a D0 meson changes from matter into antimatter and back.
The D0 meson is one of only four particles in the Standard Model of particle physics that can turn, or “oscillate”, into their antimatter particles, which are identical to their matter counterparts in most ways. The other three are the K0 meson and two types of B mesons.
Mesons are part of the large class of particles made up of fundamental particles called quarks, and contain one quark and one antiquark. The D0 meson consists of a charm quark and an up antiquark, while its antiparticle, the anti-D0, consists of a charm antiquark and an up quark.
In the strange world of quantum physics, just as Schrödinger's notorious cat can be dead and alive at the same time, the D0 particle can be itself and its antiparticle at once. This quantum “superposition” results in two particles, each with their own mass – a lighter and a heavier D meson (known technically as D1 and D2). It is this superposition that allows the D0 to oscillate into its antiparticle and back.
The D0 particles are produced in proton–proton collisions at the Large Hadron Collider (LHC), and they travel on average only a few millimetres before transforming, or “decaying”, into other particles. By comparing the D0 particles that decay after travelling a short distance with those that travel a little further, the LHCb collaboration has measured the key quantity that controls the speed of the D0 oscillation into anti-D0 – the difference in mass between the heavier and lighter D particles.
The result, 10⁻³⁸ g, crosses the “five sigma” level of statistical significance that is required to claim an observation in particle physics.
“To put this incredibly small mass difference in context, it is still a small number even when compared with the mass of the D0 particle – the same as the mass of a snowball compared to the mass of the entire Mont Blanc, the highest peak in Europe, standing at over 4800 metres,” says LHCb spokesperson Chris Parkes. “And it’s a big step in the study of the oscillatory behaviour of the D0 particles.”
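The quoted figures can be cross-checked with a quick unit conversion – an illustrative back-of-envelope calculation, not part of the LHCb analysis:

```python
# Back-of-envelope check of the quoted numbers (illustrative only;
# the mass difference is from the article, the constants from CODATA/PDG).
dm_grams = 1e-38                      # measured D1-D2 mass difference
grams_per_ev = 1.782661921e-33        # 1 eV/c^2 expressed in grams
dm_ev = dm_grams / grams_per_ev       # difference in eV/c^2
m_d0_ev = 1.86484e9                   # D0 mass, ~1.865 GeV/c^2

print(f"mass difference ~ {dm_ev:.1e} eV")            # ~ 5.6e-06 eV
print(f"relative to D0 mass: {dm_ev / m_d0_ev:.1e}")  # ~ 3e-15
```

The ratio of a few parts in 10¹⁵ is what the snowball-to-Mont-Blanc comparison in the quote is conveying.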
With the tiny mass difference now observed, a new phase of particle exploration can begin. Researchers can make further measurements of the D0 decays to obtain a more precise mass difference and look for the effect on the D0 oscillation of unknown particles not predicted by the Standard Model.
Such new particles could increase the average speed of the oscillation or the difference between the speed of the matter-to-antimatter oscillation and that of the antimatter-to-matter oscillation. If observed, such a difference could shed light on why the universe is made up entirely of matter, even though matter and antimatter should have been created in equal amounts during the Big Bang.
LHCb spokesperson Chris Parkes explains the new result. (Video: CERN)
Read more on the LHCb website.
It’s a first at the Large Hadron Collider (LHC), or indeed at any particle collider: the FASER collaboration has detected the first candidate particle interactions for neutrinos produced in LHC collisions. The result, described in a paper posted online, paves the way for studies of high-energy neutrinos at current and future colliders.
Neutrinos are the most abundant fundamental particles that have mass in the universe, and they have been detected from many sources. Yet, no neutrino produced at a particle collider has ever been directly detected, even though colliders produce them in abundance. Studying such collider neutrinos could shed new light on the still enigmatic nature of these fundamental particles, not least because collider neutrinos are produced at high energies, at which their weak interactions with matter have been little studied.
The FASER experiment’s FASERν detector and the newly approved SND@LHC detector have both been designed to catch and study collider neutrinos, and they are expected to be installed at the LHC over the course of 2021 and to begin taking data when the collider starts up again in 2022. However, the FASER collaboration was in for an early treat when it took four weeks’ worth of proton–proton collision data with a smaller pilot version of FASERν shortly before the LHC was shut down for maintenance and upgrades at the end of 2018.
After analysing the pilot detector data and estimating a background of particle events that could mimic the signal from neutrino interactions, the FASER team found several candidate events for collider neutrinos. The result has a statistical significance of 2.7 standard deviations, a little below the 3 standard deviations required to claim evidence of a particle or process in particle physics.
“The goal of the pilot detector was to demonstrate the feasibility of neutrino measurements in the experimental environment of the LHC,” says FASER co-spokesperson Jamie Boyd. “So we are very excited that this small detector, which is only about 1% of the final detector, allowed us to see the first candidate events for neutrino interactions at a collider.”
The team expects to observe about 20 000 collider neutrino interactions with the full-fledged FASERν detector in the next LHC run, from 2022 to 2024.
Two candidate events for neutrinos produced in LHC collisions and interacting in the FASERν pilot detector. The neutrinos enter the detector from the left, and interact with the detector material to produce a number of charged particles. The different lines in each event show tracks from these charged particles, originating from the neutrino interaction point. (Image: FASER/CERN)
Long-hypothesised particles called axions could solve two problems in one strike: they could explain the puzzling symmetry properties of the strong force and they could make up the mysterious dark matter that permeates the cosmos. One of the newest detectors of the CAST experiment at CERN, RADES, has now joined the worldwide hunt for axions, searching for axions from the Milky Way’s “halo” of dark matter and setting a limit on the strength of their interaction with photons. The results are described in a paper submitted for publication in the Journal of High Energy Physics.
One way of searching for axions from the Milky Way’s dark-matter halo is to look for their conversion into photons in a “resonating cavity”. If such axions surround and enter a resonating cavity that is placed in a strong magnetic field and resonates at a frequency corresponding to their mass, the chances of detecting them through their conversion into photons are increased.
Many experiments have used this search method and set limits on the interaction strength of axions with two photons in the case of small axion masses, mainly below 25 µeV (for comparison, the proton mass is 1 GeV). Searching for larger axion masses using this approach requires a smaller cavity resonating at a higher frequency, but the smaller volume of a smaller cavity decreases the chances of spotting the particles.
A workaround involves dividing the cavity into smaller cavities that resonate at a higher frequency and collectively don’t result in a loss of cavity volume. This is exactly the concept behind the RADES detector, which was installed inside one of CAST’s dipole magnet bores in 2018 and can search for axions from the Milky Way’s dark-matter halo that have a mass of around 34.67 µeV.
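The link between axion mass and cavity frequency follows from E = mc² = hf. A quick conversion sketch, using the mass quoted in the article (the conversion constant is standard; this is not taken from the CAST paper):

```python
# Resonant frequency corresponding to an axion mass, via E = m c^2 = h f.
m_axion_ev = 34.67e-6          # axion mass in eV/c^2, as quoted in the article
ev_to_hz = 2.417989e14         # 1 eV corresponds to ~241.8 THz (f = E/h)
f_hz = m_axion_ev * ev_to_hz
print(f"cavity resonance ~ {f_hz / 1e9:.2f} GHz")   # ~ 8.4 GHz
```

This is why larger axion masses force smaller (higher-frequency) cavities, motivating the RADES multi-cavity design.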
Researchers are developing complementary approaches to searching for axions, and some have searched for larger-mass axions using new cavity designs and placed limits on their interaction strength with two photons. But the best limit so far for an axion mass of 34.67 µeV was placed by CAST’s previous searches for axions from the Sun.
In its latest paper, the CAST team describes the results of the first RADES search for axions. Sifting through data taken for more than 100 hours within a period of 20 days in 2018, the team saw no signs of axions. However, the data places a limit on the interaction strength of axions with two photons in the case of axions with a mass of or close to 34.67 µeV – a limit that is more than 100 times more stringent than CAST’s previous best limit for this mass.
“This result is a significant first step in the direct search for axions using dipole magnets,” says RADES scientist Sergio Arguedas Cuendis. “And as far as axion searches go, it’s one of the most stringent limits ever set for axions with masses above 25 µeV.”
The ATLAS and CMS experiments at the Large Hadron Collider (LHC) have performed luminosity measurements with spectacular precision. A recent physics briefing from CMS complements earlier ATLAS results and shows that by combining multiple methods, both experiments have reached a precision better than 2%. For physics analyses – such as searches for new particles, rare processes or measurements of the properties of known particles – it is not only important for accelerators to increase luminosity, but also for physicists to understand it with the best possible precision.
Luminosity is one of the fundamental parameters to measure an accelerator’s performance. In the LHC, the circulating beams of protons are not continuous beams but are grouped into packets, or “bunches”, of about 100 billion protons. These bunches collide with oncoming bunches 40 million times per second at the interaction points within particle detectors. But when two such bunches pass through each other, only a few protons from each bunch end up interacting with the protons circulating in the opposite direction. Luminosity is a measure of the number of these interactions. Two main aspects of luminosity are instantaneous luminosity, describing the number of collisions happening in a unit of time (for example every second), and integrated luminosity, measuring the total number of collisions produced over a period of time.
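For head-on collisions of Gaussian bunches, the instantaneous luminosity can be estimated from the beam parameters with the textbook formula L = f·n_b·N²/(4π·σx·σy). A sketch with round LHC design values – illustrative only, since the experiments calibrate luminosity with dedicated methods:

```python
import math

# Instantaneous luminosity from nominal LHC bunch parameters
# (round design values, head-on Gaussian beams - an illustrative sketch,
# not the experiments' actual calibration procedure).
f_rev = 11245          # revolution frequency, Hz
n_b = 2808             # number of colliding bunch pairs
N = 1.15e11            # protons per bunch
sigma = 16.7e-6        # transverse beam size at the interaction point, m

L = f_rev * n_b * N**2 / (4 * math.pi * sigma**2)   # in m^-2 s^-1
print(f"L ~ {L * 1e-4:.1e} cm^-2 s^-1")             # ~ 1e34, the LHC design value
```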
Integrated luminosity is usually expressed in units of “inverse femtobarns” (fb⁻¹). A femtobarn is a unit of cross-section, a measure of the probability for a process to occur in a particle interaction. This is best illustrated with an example: the total cross-section for Higgs boson production in proton–proton collisions at 13 TeV at the LHC is of the order of 6000 fb. This means that every time the LHC delivers 1 fb⁻¹ of integrated luminosity, about 6000 fb × 1 fb⁻¹ = 6000 Higgs bosons are produced.
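The arithmetic above generalises to any process: the expected event count is the cross-section multiplied by the integrated luminosity. A minimal sketch using the article’s numbers:

```python
# Expected event count = cross-section x integrated luminosity,
# using the article's figures for Higgs production at 13 TeV.
sigma_higgs_fb = 6000      # total Higgs cross-section, ~6000 femtobarns
int_lumi_fb_inv = 1        # integrated luminosity, in fb^-1
n_higgs = sigma_higgs_fb * int_lumi_fb_inv
print(n_higgs)             # 6000 Higgs bosons per fb^-1 delivered
```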
Knowing the integrated luminosity allows physicists to compare observations with theoretical predictions and simulations. For example, physicists can look for dark matter particles that escape collisions undetected by looking at the energies and momenta of all particles produced in a collision. If there is an imbalance, it could be caused by an undetected particle – potentially a dark matter particle – carrying energy away. This is a powerful method of searching for a large class of new phenomena, but it has to take into account many effects, such as neutrinos produced in the collisions. Neutrinos also escape undetected and leave an energy imbalance, so in principle they are indistinguishable from the new phenomena. To see if something unexpected has been produced, physicists have to look at the numbers.
So if 11 000 events show an energy imbalance, and the simulations predict 10 000 events containing neutrinos, this could be significant. But if physicists know the luminosity only to a precision of 10%, all 11 000 events could be ordinary neutrino events, with the apparent excess simply reflecting 10% more collisions than assumed. Clearly, a precise determination of luminosity is critical.
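A minimal sketch of this scenario, using the article’s round numbers:

```python
# How a 10% luminosity uncertainty can absorb an apparent excess
# (an illustrative sketch with the article's round numbers).
observed = 11000          # events showing an energy imbalance
predicted = 10000         # simulated neutrino events at nominal luminosity
lumi_uncertainty = 0.10   # 10% relative uncertainty on the luminosity

# If the true luminosity were 10% higher, the prediction scales up too:
predicted_high = predicted * (1 + lumi_uncertainty)
print(predicted_high >= observed)   # True: the 'excess' vanishes
```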
There are also types of analyses that depend much less on absolute knowledge of the number of collisions. This is the case, for example, in measurements of ratios of different particle decays, such as the recent LHCb measurement, where the luminosity uncertainty cancels out in the ratio. Other searches for new particles look for peaks in a mass distribution, and so rely more on the shape of the observed distribution than on the absolute number of events. But these, too, need the luminosity for any kind of interpretation of the results.
Ultimately, the greater the precision of the luminosity measurement, the more physicists can understand their observations and delve into hidden corners beyond our current knowledge.
Established in 2019 with its central hub at CERN, the European Consortium for Astroparticle Theory (EuCAPT) aims to bring together the European community of theoretical astroparticle physicists and cosmologists to tackle some of the greatest mysteries in science.
There are strong hints that explanations for dark matter and dark energy, the origin of high-energy cosmic rays, the matter–antimatter asymmetry, and other enigmas about the universe at large lie in the domain of particle physics. Addressing them therefore demands a highly interdisciplinary approach by a strong and diverse community.
"Astroparticle physics is undergoing a phase of profound transformation", says EuCAPT Director Gianfranco Bertone of the Centre for Gravitation and Astroparticle Physics at the University of Amsterdam. "We have recently obtained extraordinary results, such as the discovery of high-energy cosmic neutrinos with IceCube and the direct detection of gravitational waves with LIGO and Virgo, and we have witnessed the birth of multi-messenger astrophysics. Yet we have formidable challenges ahead of us."
The symposium featured 29 invited presentations and 42 lightning talks given by young researchers, covering every aspect of astroparticle physics and cosmology, from early-universe inflationary dynamics to late-universe structure formation. The event also included a plenary session dedicated to the planning of a community-wide white paper, followed by thematic parallel discussions. An award ceremony congratulated Hannah Banks from the University of Cambridge, Francesca Capel from TU Munich and Charles Dalang from the University of Geneva for the best talks by young scientists.
"The symposium has been a successful opportunity for community building and for looking into the future of astroparticle physics and cosmology," said Gian Giudice, the Head of CERN’s Theoretical Physics department. "The emphasis on the future was underlined by our choice of selecting almost all speakers from among young researchers."
EuCAPT is led by an international steering committee comprising 12 theorists from institutes in France, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom, and from CERN. Its aim is to coordinate scientific and training activities, help researchers attract adequate resources for their projects, and promote a stimulating and open environment in which young scientists can thrive. CERN will act as the central hub of EuCAPT for the first five years.
The Standard Model of particle physics is alive and well. But it is not complete, so physicists continue to search for new particles and forces that could help complete the model and also explain some tensions with the model – or “anomalies” – in the behaviour of known particles. In a paper accepted for publication in Physical Review Letters, the NA64 collaboration describes how a search for new unknown particles – lightweight “X bosons” that could carry a new force – has allowed it to set bounds on how much these particles could contribute to a fundamental property of the electron, in which an apparent anomaly has recently emerged.
The property in question is the anomalous magnetic moment. The magnetic moment of a particle is a measure of how the particle interacts with a magnetic field. The anomalous magnetic moment is the part of the magnetic moment caused by the interaction of the particle with “virtual” particles that continually pop into and out of existence. These virtual particles comprise all the known particles, predicted by the Standard Model, but they could also include particles never before observed. Therefore, a difference between the Standard Model prediction of the anomalous magnetic moment of a particle and a high-precision measurement of this property could be a sign of new physics in the form of new particles or forces.
The most striking example of such an anomaly is the muon’s anomalous magnetic moment, for which Fermilab in the US recently announced a difference with theory at a significance level of 4.2 standard deviations – just a little below the 5 standard deviations required to claim a discovery of new physics. But there is another example, although at a lower significance level: the Standard Model’s prediction of the electron’s anomalous magnetic moment, based on the measurement of the fundamental constant of nature that sets the strength of the electromagnetic force, differs from the direct experimental measurement at a level of 1.6 or 2.4 standard deviations, depending on which of two measurements of the fundamental constant is used.
Like other anomalies, this anomaly may fade away as more measurements are made or as theoretical predictions improve, but it could also be an early indication of new physics, so it is worth investigating. In its new study, the NA64 collaboration set out to investigate whether new lightweight X bosons could contribute to the electron’s anomalous magnetic moment and explain this apparent anomaly.
NA64 is a fixed-target experiment that directs an electron beam of 100-150 GeV energy, generated using a secondary beamline from the Super Proton Synchrotron, onto a target to look for new particles produced by collisions between the beam’s electrons and the target’s atomic nuclei. In the new study, the NA64 team searched for lightweight X bosons by looking for the “missing” collision energy they would carry away. This energy can be identified by analysing the energy budget of the collisions.
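The energy-budget idea can be sketched very simply – the numbers and the bookkeeping below are illustrative, not NA64’s actual event selection:

```python
# Missing-energy idea in a fixed-target event: the incoming beam energy
# must be accounted for by the detected particles. Illustrative numbers
# only - not NA64's actual selection criteria.
e_beam = 100.0            # incoming electron energy, GeV
e_detected = 40.0         # total energy recorded by the detectors, GeV
e_missing = e_beam - e_detected
print(f"missing energy: {e_missing} GeV")   # 60.0 GeV unaccounted for -
                                            # possibly carried off by an X boson
```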
Analysing data collected in 2016, 2017 and 2018, which in total corresponded to about three hundred billion electrons hitting the target, the NA64 researchers were able to set bounds on the strength of the interaction of X bosons with an electron and, as a result, on the contributions of these particles to the electron’s anomalous magnetic moment. They found that X bosons with a mass below 1 GeV could contribute at most between one part in a quadrillion and one part in ten trillion, depending on the X boson’s mass.
“These contributions are too small to explain the current anomaly in the electron’s anomalous magnetic moment,” says NA64 spokesperson Sergei Gninenko. “But the fact that NA64 reached an experimental sensitivity that is better than the current accuracy of the direct measurements of the electron’s anomalous magnetic moment, and of recent high-precision measurements of the ﬁne-structure constant, is amazing. It shows that NA64 is well placed to search for new physics, and not only in the electron’s anomalous magnetic moment.”
On Thursday, 29 April 2021 at 4.30 p.m., Professor Monica Colpi (Università degli Studi di Milano-Bicocca) will give a presentation on “The Laser Interferometer Space Antenna to Explore the Invisible Universe” (more on LISA below). To attend, follow the instructions available in the Indico event.
The colloquium will be followed by a Library Talk event presenting two science popularisation books on the beauty of astrophysics and gravitational waves: Monica Colpi’s Notte Siriaca (Science Express) and Paola Catapano’s Il Lungo Viaggio delle Onde Gravitazionali (Textus). The talks will be moderated by Antonella Del Rosso, after a short introduction by Tullio Basaglia. As the books are published in Italian, the Library Talk event will be held in Italian only.
All colloquium attendees are invited to stay for this presentation, which will be accessible via the same Zoom link.
The Laser Interferometer Space Antenna (LISA)
LISA, a gigameter-scale space-based gravitational-wave observatory, will explore the gravitational-wave universe in the band from below 0.1 mHz to above 0.1 Hz. LISA will grant us access to a huge cosmological volume with unprecedented reach deep into space, detecting signals up to redshifts of 20–30 and even beyond, if sources exist. LISA will detect massive black hole coalescences to unveil the as-yet-unknown origins of the first quasars and to shed light on the teeming population of middleweight black holes forming in galactic dark matter halos. LISA will discover the link between the most energetic phenomena in the universe – accreting and merging black holes – and the grand design of galaxy assembly. In synergy with third-generation ground-based interferometers, we will discover how gravitational collapse to a black hole is triggered, on all astrophysically relevant mass scales from a few tens to a few billions of solar masses. I will address how the X-ray mission Athena, which will join LISA in concurrent multi-messenger observations of massive black hole coalescences, will greatly enhance our knowledge of the propagation properties of gravitational waves and of the rate of expansion of our universe.
Since the Higgs boson was discovered in 2012, scientists at the Large Hadron Collider (LHC) have been studying the properties of this very special particle and its relation to the fundamental mechanism that generates the masses of elementary particles. One property that remains to be experimentally verified is whether the Higgs boson can couple to itself, known as self-coupling. Such an interaction would contribute to the production of a pair of Higgs bosons in the LHC's high-energy proton–proton collisions, an incredibly rare process in the Standard Model – more than 1000 times rarer than the production of a single Higgs boson! Measuring a Higgs boson self-coupling that differs from the predicted value would have important consequences: the universe might be able to transition into a lower-energy state, and the laws that govern the interactions of matter could take a very different shape.
At the ongoing Rencontres de Moriond conference, the ATLAS collaboration presented the result of a study that further explores this question. ATLAS physicists looked for two intimately related Higgs-pair production processes that could be present in LHC collisions, only one of which involves the Higgs boson self-coupling; this process contributes mainly to the production of Higgs pairs with low total mass. The two processes interfere quantum mechanically and suppress Higgs boson pair production in the Standard Model. If a new physics phenomenon is at play, it could change the Higgs boson self-coupling and ATLAS might see more pairs of Higgs bosons than expected – or, in particle physics parlance, measure a higher cross-section.
For their new study, ATLAS physicists developed new analysis techniques to search for the rare process in which one of the two Higgs bosons decays to two photons and the other decays to two bottom quarks (HH → ɣɣbb). First, they divided the proton–proton collision events into low- and high-mass regions, so as to optimise the sensitivity to the Higgs boson self-coupling. Then, using a machine-learning algorithm, they separated the events that look like the HH → ɣɣbb process from those that don’t. Finally, they determined the cross-section for Higgs-pair production and observed how it varies as a function of the ratio of the Higgs boson self-coupling to its Standard Model value. This allowed ATLAS to constrain the Higgs boson self-coupling to between –1.5 and 6.7 times the Standard Model prediction, as well as the Higgs-pair production cross-section. The result on the Higgs boson self-coupling is more than twice as powerful as the previous ATLAS result in the same Higgs-pair decay channel.
Although this result sets the world’s best constraints on the size of the Higgs boson self-coupling, the work is just beginning. This is a preview of what is to come, as much more data would be needed to observe the Higgs boson self-coupling if it were close to its Standard Model prediction. Observing the Higgs boson self-coupling is indeed one of the raisons d’être of the High-Luminosity LHC (HL-LHC) programme, an upgrade to the LHC scheduled to begin operations in the late 2020s. The HL-LHC is expected to deliver a dataset more than 20 times larger than the one used in this analysis and to operate at higher collision energy. If Higgs-pair production is as predicted by the Standard Model, it should be observed in this huge dataset, and a more quantitative statement will be made on the strength of the Higgs boson coupling to itself.
Read more on the ATLAS website.
CERN’s Antimatter Factory is the only place in the world where low-energy antiprotons – the antimatter counterparts of protons – are produced. But in the not-so-distant future it could also be the first place to dispatch trapped antiprotons to another location. On 17 March 2021, the CERN Research Board approved the development of two new experiments to carry antiprotons from the Antimatter Factory to other facilities, for antimatter and nuclear-physics studies. BASE-STEP and PUMA, as the experiments are called, are compact enough to be transported in a small truck or van.
BASE-STEP is based on the BASE experiment – a set-up of traps to store and study in detail antiprotons produced at the Antimatter Factory. Using this set-up, the BASE team measures the properties of the antiproton and compares them with those of the proton to see if there are differences between the two – if found, such differences could shed light on the imbalance between matter and antimatter in the universe. BASE has been performing ever more precise antiproton measurements, but the precision of these measurements is limited by disturbances to the set-up’s magnetic field caused by the magnetic environment of the Antimatter Factory.
BASE-STEP is a variant of the BASE set-up that has been designed to be carried to a facility at CERN or elsewhere, one that has a calmer magnetic environment and thus allows higher-precision measurements to be made. The device will feature a first trap to receive and release the antiprotons produced at the Antimatter Factory and a second trap to store the antiprotons.
PUMA is based on a different transportable antiproton trap system and has a different scientific goal. It will transport antiprotons from the Antimatter Factory to CERN’s nuclear-physics facility, ISOLDE, for investigation of exotic nuclear-physics phenomena. It will consist of a first trapping zone to stop antiprotons, and a second trapping zone to host collisions between the antiprotons and radioactive atomic nuclei that are routinely produced at ISOLDE but decay too rapidly to be transported anywhere themselves.
Analysis of the outcome of these collisions, which will be detected by a particle detector surrounding the collision zone, will help researchers determine the relative densities of protons and neutrons at the surface of nuclei. These densities could reveal whether the nuclei have exotic properties such as thick neutron “skins” or extended halos of protons or neutrons around their core. Such knowledge could shed light on the interior of neutron stars.
PUMA and BASE-STEP are expected to be operational in 2023.
Today the LHCb experiment at CERN announced new results which, if confirmed, would suggest hints of a violation of the Standard Model of particle physics. The results focus on the potential violation of lepton flavour universality and were announced at the Moriond conference on electroweak interactions and unified theories, as well as at a seminar held online at CERN, the European Organization for Nuclear Research.
The measurement made by the LHCb (Large Hadron Collider beauty) collaboration compares two types of decays of beauty quarks. The first decay involves the electron and the second the muon, another elementary particle similar to the electron but approximately 200 times heavier. The electron and the muon, together with a third particle called the tau, are the three types, or “flavours”, of leptons. The Standard Model of particle physics predicts that decays involving different flavours of leptons, such as those in the LHCb study, should occur with the same probability, a feature known as lepton flavour universality that is usually measured by the ratio between the decay probabilities. In the Standard Model of particle physics, the ratio should be very close to one.
The new result indicates hints of a deviation from one: the statistical significance of the result is 3.1 standard deviations, which implies a probability of around 0.1% that the data is compatible with the Standard Model predictions. “If a violation of lepton flavour universality were to be confirmed, it would require a new physical process, such as the existence of new fundamental particles or interactions,” says LHCb spokesperson Professor Chris Parkes from the University of Manchester and CERN. “More studies on related processes are under way using the existing LHCb data. We will be excited to see if they strengthen the intriguing hints in the current results.”
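The correspondence between 3.1 standard deviations and the quoted ~0.1% probability is the one-sided Gaussian tail probability, the usual particle-physics convention, and can be checked directly:

```python
import math

# One-sided p-value corresponding to a 3.1 standard-deviation result,
# using the Gaussian tail probability (the usual convention in the field).
z = 3.1
p = 0.5 * math.erfc(z / math.sqrt(2))
print(f"p ~ {p:.1e}")   # ~ 1e-3, i.e. the ~0.1% quoted in the article
```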
The deviation presented today is consistent with a pattern of anomalies measured in similar processes by LHCb and other experiments worldwide over the past decade. The new results determine the ratio between the decay probabilities with greater precision than previous measurements and, for the first time, use all the data collected so far by the LHCb detector.
The LHCb experiment is one of the four large experiments at the Large Hadron Collider at CERN, situated underground on the Franco-Swiss border near Geneva. The experiment is designed to study decays of particles containing a beauty quark, a fundamental particle that has roughly four times the mass of the proton. The results presented today focus on lepton flavour universality, but the LHCb experiment also studies matter-antimatter differences.
Looking towards the future, the LHCb experiment is well placed to clarify the potential existence of new physics effects hinted at in the decays presented today. The LHCb experiment is expected to start collecting new data next year following an upgrade to the detector.
Photo of the LHCb experiment: http://cds.cern.ch/record/2302374?ln=fr#24
Caption: “The LHCb experiment is one of the four large experiments at the Large Hadron Collider at CERN, situated underground on the Franco-Swiss border near Geneva.”
LHCb paper: https://arxiv.org/abs/2103.11769
LHCb article: https://lhcb-public.web.cern.ch/Welcome.html#RK2021
The more results it delivers, the more surprises it reveals. That pretty much sums up the outcome so far of the AMS experiment – a space-based detector that was assembled at CERN and has been detecting electrically charged particles from outer space, known as cosmic rays, since 2011. And, surprise, surprise, the latest result from the experiment, described in a paper published in Physical Review Letters, is no exception. The new result shows that the properties of iron nuclei – the most abundant primary cosmic rays beyond silicon nuclei and the heaviest cosmic rays measured by AMS until now – are surprisingly different from those of other heavy primary cosmic rays.
Historically, cosmic rays are classified into two classes, primaries and secondaries. Primary cosmic rays are produced in supernovae explosions in the Milky Way and beyond, whereas secondary cosmic rays are produced by interactions between the primary cosmic rays and the interstellar medium. But an AMS study from last year revealed that, contrary to expectations, primary cosmic rays have at least two distinct classes, one made of light nuclei and another made of heavy nuclei. And now the new AMS study shows that iron nuclei, which are much heavier than any other nuclei measured by AMS so far, belong unexpectedly not to the same class as the other heavy nuclei but instead to the class of light nuclei.
The AMS team arrived at this conclusion using AMS data on the number, or more accurately the flux, of iron nuclei and how this flux varies with rigidity – a measure of a charged particle’s momentum in a magnetic field. Analysing the data in the rigidity range from 2.65 GV to 3.0 TV, the team found that, above a rigidity of 80.5 GV, the rigidity dependence of the iron flux is identical to that of the fluxes of the light primary helium, carbon and oxygen cosmic rays, and different from that of the fluxes of the heavy primary neon, magnesium and silicon cosmic rays.
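Rigidity is simply a particle’s momentum divided by its electric charge, R = pc/(Ze), expressed in volts. A minimal sketch of the conversion (the function name and the momentum value are illustrative, not taken from the AMS analysis):

```python
def rigidity_gv(momentum_gev_per_c: float, charge_z: int) -> float:
    """Magnetic rigidity in gigavolts (GV) for a fully stripped nucleus
    of charge Z*e carrying total momentum `momentum_gev_per_c` in GeV/c."""
    return momentum_gev_per_c / charge_z

# An iron nucleus (Z = 26) with 2093 GeV/c of total momentum sits at
# 80.5 GV -- the rigidity above which AMS finds iron tracking the
# light primaries.
print(rigidity_gv(2093.0, 26))  # 80.5
print(rigidity_gv(80.5, 1))     # a proton needs only 80.5 GeV/c for the same rigidity
```

Because rigidity, not momentum, sets a charged particle’s trajectory in galactic magnetic fields, comparing fluxes of different nuclei at the same rigidity is the natural way to test whether they propagate alike.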
“Our results are mind-bending, defying again conventional models of cosmic-ray origin and propagation in the interstellar medium,” says AMS-experiment spokesperson Samuel Ting. “It will no doubt be interesting to see what theorists and modellers make of them.”
The TOTEM collaboration at the LHC, together with the DØ collaboration at the Tevatron collider at Fermilab, have announced the discovery of the odderon – an elusive state of three fundamental particles called gluons that was predicted almost 50 years ago. The result was presented on Friday 5 March during a meeting at CERN, and follows the joint submission in December 2020 of a CERN/Fermilab preprint by TOTEM and DØ reporting the observation.
“This result probes the deepest features of the theory of quantum chromodynamics, notably that gluons interact between themselves and that an odd number of gluons are able to be ‘colourless’, thus shielding the strong interaction,” says TOTEM spokesperson Simone Giani of CERN. “A notable feature of this work is that the results are produced by combining the LHC and Tevatron data at different energies.”
States comprising two, three or more gluons are usually called “glueballs”, and are peculiar objects made only of the carriers of the strong force. The advent of quantum chromodynamics (QCD) led theorists to predict the existence of the odderon in 1973. Proving its existence has been a major experimental challenge, however, requiring detailed measurements of protons as they glance off one another in high-energy collisions.
While most high-energy collisions cause protons to break into their constituent quarks and gluons, roughly 25% are elastic collisions in which the protons remain intact but emerge on slightly different paths (deviating by around a millimetre over a distance of 200 m at the LHC). TOTEM measures these small deviations in proton–proton scattering using two detectors located on either side of the CMS experiment, 220 m from the interaction point, while DØ employed a similar setup at the Tevatron proton–antiproton collider.
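To see just how glancing these elastic collisions are, the deviation quoted above (about a millimetre over 200 m) translates into a scattering angle of only a few microradians; a quick back-of-the-envelope check:

```python
import math

deviation_m = 1e-3   # ~1 mm transverse deviation (figure quoted in the text)
distance_m = 200.0   # lever arm from the interaction point to the detectors

# For such tiny angles, atan2(d, L) is effectively d/L.
angle_urad = math.atan2(deviation_m, distance_m) * 1e6
print(angle_urad)  # ~5.0 microradians
```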
At lower energies, differences in proton–proton vs proton–antiproton scattering are due to the exchange of different virtual mesons – particles made up of a quark and an antiquark. At multi-TeV energies, on the other hand, proton interactions are expected to be mediated purely by gluons. In particular, elastic scattering at low-momentum transfer and high energies has long been explained by the exchange of a pomeron – a “colour-neutral” virtual glueball made up of an even number of gluons.
However, in 2018, TOTEM reported measurements at high energies that could not easily be explained by this traditional idea. Instead, a further QCD object seemed to be at play, supporting models in which a three-gluon compound, or one containing higher odd numbers of gluons, was being exchanged. The results were sufficient to claim evidence for the odderon, although not yet its definitive observation.
The new work is based on a model-independent analysis of data at medium-range momentum transfer. The TOTEM and DØ teams compared LHC proton–proton data (recorded at collision energies of 2.76, 7, 8 and 13 TeV and extrapolated to 1.96 TeV) with Tevatron proton–antiproton data measured at 1.96 TeV, and found evidence again for the odderon. When the teams combined this result with measurements at much smaller scattering angles at 13 TeV by the TOTEM collaboration, the significance was boosted to the discovery level.
“When combined with the measurements at 13 TeV, the significance of the result is in the range of 5.2–5.7 standard deviations and thus constitutes the first experimental observation of the odderon,” said Christophe Royon of the University of Kansas, who presented the results on behalf of DØ and TOTEM last week. “This is a major discovery by CERN and Fermilab.”
In addition to the new TOTEM-DØ model-independent study, several theoretical papers based on data from the Intersecting Storage Rings, the Super Proton Synchrotron, the Tevatron and the LHC, and on model-dependent inputs, provide additional evidence supporting the conclusion that the odderon exists.
This update is a modified version of a story originally published in the CERN Courier.
Protons are one of the main building blocks of the visible universe. Together with neutrons, they make up the nuclei of every atom. Yet, several questions loom about some of the proton’s most fundamental properties, such as its size, internal structure and intrinsic spin. In December 2020, the CERN Research Board approved the first phase (“phase-1”) of a new experiment that will help settle some of these questions. AMBER, or Apparatus for Meson and Baryon Experimental Research, will be the next-generation successor of the Laboratory’s COMPASS experiment.
COMPASS receives particle beams from CERN’s Super Proton Synchrotron and directs them onto various targets to study how quarks and gluons form hadrons (such as protons, pions and kaons) and give these composite particles their distinctive properties. Using this approach, COMPASS has obtained many important results, including several results linked to the proton’s spin structure and a measurement of the pion’s polarisability; the polarisability of a hadron is the degree to which its constituent positive and negative electric charges can be separated in an electric field.
AMBER will build on COMPASS’s legacy and take it to the next level. By upgrading existing COMPASS components and introducing new detectors and targets, as well as using state-of-the-art read-out technology, the team behind AMBER plans to take three kinds of measurements in the experiment’s first phase.
First, by sending muons, heavier cousins of the electron, onto a hydrogen target, the AMBER team plans to determine with high precision the proton’s charge radius – the extent of the spatial distribution of the particle’s electric charge. This measurement would help resolve the proton radius puzzle, which emerged in 2010 when a new measurement of the proton radius was found to be substantially different from the previously accepted measurements.
Second, by directing protons onto proton and helium-4 targets, AMBER will determine the little-known production rate of antiprotons, the antimatter counterparts of protons, in these collisions. These measurements will improve the accuracy of predictions of the flux of antiprotons in cosmic rays, which are needed to interpret data from experiments searching for evidence of dark matter in the flux of antiproton cosmic rays.
Third, by focusing pions on nuclear targets, AMBER will measure the momentum distributions of the quarks and gluons that form the pion. These measurements will cast light on the particle dynamics that holds the pion together and ultimately on the origin of the masses of hadrons, which is known technically as the emergence of hadron mass.
Further insights into the emergence of hadron mass are anticipated from studies of the internal structure of kaons in the second phase (“phase-2”) of AMBER. These studies require the beamline that feeds COMPASS to be upgraded to deliver a charged-kaon beam of high energy and intensity.
Combining AMBER’s pion and kaon results will lead to a better understanding of the interplay between nature’s two mass-generating mechanisms: the mechanism that gives hadrons their masses and the Higgs mechanism, which endows massive elementary particles with mass.
AMBER is expected to start taking data in 2022, after the completion of the last run of COMPASS in 2021–2022.
Read more about COMPASS and AMBER in this Experimental Physics newsletter article.
How many new particles has the LHC discovered? The most widely known discovery is of course that of the Higgs boson. Less well known is the fact that, over the past 10 years, the LHC experiments have also found more than 50 new particles called hadrons. Coincidentally, the number 50 appears in the context of hadrons twice, as 2021 marks the 50th anniversary of hadron colliders: on 27 January 1971, two beams of protons collided for the first time in CERN’s Intersecting Storage Rings accelerator, making it the first accelerator in history to produce collisions between two counter-rotating beams of hadrons.
So what are these new hadrons, which number 59 in total? Let’s start at the beginning: hadrons are not elementary particles – physicists have known that since 1964, when Murray Gell-Mann and George Zweig independently proposed what is known today as the quark model. This model established hadrons as composite particles made out of new types of elementary particles named quarks. But, in the same way as researchers are still discovering new isotopes more than 150 years after Dmitri Mendeleev established the periodic table, studies of possible composite states formed by quarks are still an active field in particle physics.
The reason for this lies with quantum chromodynamics, or QCD, the theory describing the strong interaction that holds quarks together inside hadrons. This interaction has several curious features, including the fact that the strength of the interaction does not diminish with distance, leading to a property called colour confinement, which forbids the existence of free quarks outside of hadrons. These features make this theory mathematically very challenging; in fact, colour confinement itself has not been proven analytically to this date. And we still have no way to predict exactly which combinations of quarks can form hadrons.
What do we know about hadrons then? Back in the 1960s, there were already more than 100 known varieties of hadrons, which were discovered in accelerator and cosmic-ray experiments. The quark model allowed physicists to describe the whole “zoo” as different composite states of just three different quarks: up, down and strange. All known hadrons could be described as either consisting of three quarks (forming baryons) or as quark–antiquark pairs (forming mesons). But the theory also predicted other possible quark arrangements. Already in Gell-Mann’s original 1964 paper on quarks, the notion of particles containing more than three quarks appeared as a possibility. Today we know that such particles do exist, but it took several decades to confirm in experiments the first four-quark and five-quark hadrons, or tetraquarks and pentaquarks.
A full list of the 59 new hadrons found at the LHC is shown in the image below. Of these particles, some are pentaquarks, some are tetraquarks and some are new higher-energy (excited) states of baryons and mesons. The discovery of these new particles, together with measurements of their properties, continues to provide important information for testing the limits of the quark model. This in turn enables researchers to further their understanding of the strong interaction, to verify theoretical predictions and to tune models. This is especially important for the research done at the Large Hadron Collider, since the strong interaction is responsible for the vast majority of what happens when hadrons collide. The better we can understand the strong interaction, the more precisely we can model these collisions and the better are our chances of seeing small deviations from expectations that could hint at possible new physics phenomena.
The hadron discoveries from the LHC experiments keep coming, mainly from LHCb, which is particularly suited to studying particles containing heavy quarks. The first hadron discovered at the LHC, χb(3P), was discovered by ATLAS, and the most recent ones include a new excited beauty strange baryon observed by CMS and four tetraquarks detected by LHCb.

The full list of new hadrons found at the LHC, organised by year of discovery (horizontal axis) and particle mass (vertical axis). The colours and shapes denote the quark content of these states. (Image: LHCb/CERN)
Read also this article in the CERN Courier.
By: Achintya Rao
5 MARCH, 2021
In the final part of the LHC Physics at Ten series, we look at the searches that go beyond our current understanding of the universe.

An event recorded during 2016 with the CMS detector that contains 10 jets (orange cones) and a muon (red line). (Image: CERN)
Count all the known kinds of particles in the universe. Now double it. This is the promise of a family of theoretical models known as Supersymmetry, or SUSY for short.
The notion of theories predicting a doubling of observed particles may not be as bizarre as it seems. In fact, it has historical precedent with the story of antimatter.
“The first hints of antimatter came from Paul Dirac trying to solve problems in relativistic quantum mechanics,” says Laura Jeanty, who co-leads the Supersymmetry (SUSY) group on the ATLAS experiment at the Large Hadron Collider. “He came up with equations that essentially had four solutions instead of two, and the symmetries of the maths allow positive as well as negative values.” In 1928, Dirac concluded that if the negative values represented electrons, the positive values must represent an equivalent positively charged particle. The positron, or antielectron, was eventually discovered by Carl Anderson in 1932.
“At the time of Dirac’s theoretical work, however,” Jeanty adds, “it was a mathematical quirk that didn’t have any known physical reality.” Today, we have discovered antiparticles for all the charged particles in the “Standard Model” of particle physics – the best description we have of our universe at the quantum scale. The Standard Model, however, has important limitations, and SUSY provides a theoretical extension to it by introducing new mathematical symmetries.

Inexplicable hierarchies

A CMS event display from 2016 containing 10 jets (orange cones) and a muon (red line), representative of the signatures that certain supersymmetry models would leave in the CMS detector. (Image: CMS/CERN)
As a scientific theory, the Standard Model is incredibly robust. Frustratingly so for physicists, because they are aware that this theory does not explain everything about the infinitesimal world of particles and quantum forces. Nevertheless, experimentalists have found no chinks in its armour, no deviations from its very accurate predictions, despite its limitations.
One such limitation is that the Standard Model accounts for only three of the known quantum forces in the universe: the strong, electromagnetic and weak forces; gravity is not in the mix. A symptom of this is known as the hierarchy problem, pertaining to the vast difference between the strengths of the strong, electromagnetic and weak forces on one hand and gravity on the other. Despite its name, the weak force is around 24 orders of magnitude (10^24 times) stronger than gravity. But why does this matter?
“The hierarchy problem,” remarks Pieter Everaerts, Laura’s counterpart on the CMS experiment, “tells us that there have to be corrections to our current knowledge of physics.” The problem affects, for example, the mass of the Higgs boson that was discovered by ATLAS and CMS in 2012. According to quantum mechanics, the Higgs boson should have a mass several orders of magnitude higher than what was observed, because of its interactions with ephemeral virtual particles that pop in and out of existence.
SUSY provides an elegant theoretical solution to this problem. It does this by proposing the following: fermions – the particles that make up matter – have bosonic super-partners known as “sfermions”, while bosons – the force carriers of the Standard Model – are paired up with fermionic “bosinos”.
“With SUSY, the Higgs boson has twice as many particles to interact with,” adds Everaerts. Neatly, this allows the excess values of its expected mass coming from its interactions with ordinary particles to cancel out with the values of its interactions with supersymmetric particles. You are left with a predicted mass for the Higgs boson that is close to the observed mass of 125 GeV.
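Schematically (this is the standard textbook illustration, not a calculation from the articles above): a fermion with Yukawa coupling $\lambda_f$ pulls the Higgs mass-squared down through a loop term that grows with the cutoff scale $\Lambda$, while its scalar super-partners push it up by a term of the same size and opposite sign:

```latex
\delta m_H^2 \;\sim\; -\,\frac{|\lambda_f|^2}{8\pi^2}\,\Lambda^2
\;+\; \frac{\lambda_S}{8\pi^2}\,\Lambda^2
\;=\; 0
\qquad \text{when } \lambda_S = |\lambda_f|^2 .
```

The exact numerical factors depend on conventions and on how the super-partner states are counted; the point is only the term-for-term cancellation of the dangerous $\Lambda^2$ contributions, which supersymmetry enforces by relating the couplings.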
All that remains is the one small detail of finding at least one of the predicted SUSY particles.

Optimism at a new frontier
Before the LHC began colliding protons together, there was a buzz of expectation among experimental and theoretical particle physicists. “It seems strange to say this now,” continues Everaerts, “but when the LHC began colliding protons in 2010, some were expecting us to discover six or seven SUSY particles immediately.”

The density of allowed supersymmetric models before and after ATLAS had searched through Run-1 data (data gathered up to February 2013). (Image: ATLAS/CERN)
There was even concern that too many SUSY particles (or “sparticles”) at a low enough mass would add to the “background”, or noise, and make it harder to study Standard Model processes at the LHC. This optimism had to quickly face reality when no sparticles manifested. Indeed, no deviations from the Standard Model have been observed in the nearly 15 million billion (15 000 000 000 000 000) proton–proton collisions that have taken place inside each of ATLAS and CMS.
Over the years, data collected by ATLAS and CMS enabled the collaborations to discard several of the simpler SUSY models, ruling out sparticles with masses up to around a teraelectronvolt (or TeV). All this showed, though, was that the most rudimentary interpretations of the theories were inadequate. “When we rule things out to a certain energy, we are aware that these are not realistic models, they are benchmarks,” says Federico Meloni of ATLAS. “And when you look at the same data through a less simplified interpretation, what we call an analysis in multidimensional parameter space,” he continues, “a limit of 2 TeV can become 500 GeV [gigaelectronvolt] or maybe we don’t have a limit at all. When looking at the big picture, it is only after ten years of operations that we are starting to be able to make interesting statements about the key issues.”
For the moment, in the absence of a discovery, SUSY remains firmly in the realm of theory alone. “Supersymmetry in the way we were thinking has been ruled out and we have to now look for it in a different way,” says Gian Giudice, head of CERN’s Theoretical Physics department. “We continue to advance techniques to search for supersymmetric particles,” Jeanty adds. “More LHC data will help us to look further into the challenging corners of phase space, where new physics could still be lurking.”
The LHC, however, is after more than just SUSY. Indeed, extensions of the Standard Model come in many forms, and on ATLAS and CMS the several teams performing searches for physics beyond the Standard Model are grouped together under the name “Exotics”. (The name “Miscellaneous” is quite obviously less exciting.) Some of these searches seem to come right out of science fiction…

Extra dimensions and micro black holes
Elementary particle physics concerns itself with the very small: tiny particles interacting through quantum forces at an unimaginably minuscule scale. Gravity, on the other hand, applies to the very large – think planets, stars and galaxies – and has sat apart from the quantum domain in our understanding of the universe. A quantum theory of gravity has remained the holy grail of high-energy physics for decades. “If there is a quantum description of gravity, is there a particle that is responsible for mediating gravity?” asks Carl Gwilliam, a former coordinator of ATLAS exotic physics.
Discovery of such a particle, known as a graviton, would help settle the debate on the many theoretical models that attempt to unify gravity with the other three forces.

A multi-jet event display observed by the CMS detector in 2015 in the search for microscopic black holes. (Image: CMS/CERN)
ATLAS and CMS are searching for gravitons directly, as for any other new particle, by looking for a bump in a smoothly falling distribution in the data. But, because the theories that predict the existence of a graviton also predict the existence of more than four dimensions of spacetime, the physicists are also looking for particles that are produced in collisions before they disappear into the extra dimensions. You cannot detect these disappearing particles directly; indeed, you cannot even ask the detectors to take snapshots of such collision events because there is nothing to “trigger” the detectors. You can, however, infer their presence by detecting an accompanying jet of particles produced from the same collision and observing simultaneously a lot of missing energy from the interaction itself. Niki Saoulidou, who co-leads the CMS Exotics team, points out that before the LHC was switched on, it was thought that these kinds of mono-jet searches, which also look for dark matter or supersymmetric particles, were too challenging for hadron-collider environments. “But we have evolved our tools, our techniques, our detector and our physics understanding so much that we now consider these as standard searches,” she says.
Another way of detecting extra dimensions made headlines before the LHC began operations: micro black holes. If produced in high-energy proton–proton collisions at the LHC, these tiny black holes would instantly evaporate, leaving behind multiple jets of particles. “The thing with black-hole searches is that they would be very spectacular!” remarks Gwilliam. “You would expect to see a black-hole event very early on in a new energy regime.” Since it began operations, the LHC has explored three new high-energy regimes: 7 TeV, 8 TeV and 13 TeV. ATLAS and CMS also searched at the lower energy of 900 GeV. “Unfortunately, these have not shown up in the data,” adds Gwilliam, “so we have set very strong limits on their existence.”
What does this mean for unifying gravity with the quantum forces? Saoulidou is philosophical: “It could be that we don’t have a quantum theory of gravity for a very good reason, which nature knows but we don’t.”
Nevertheless, those 15 million billion collisions that ATLAS and CMS have analysed make up only 5% of the total data volume the LHC will deliver over the course of its lifetime. The graviton could still be out there.

Mysterious missing pieces and uncanny coincidences
“Results from searches for exotic physics were at the forefront of work done at the start of the LHC era,” says Adish Vartak, former co-leader of the CMS Exotics team. There was a lot of potential for finding new physics, given that the LHC was operating at an energy around four times higher than that of the previous highest-energy collider, the Tevatron at Fermilab. “When the LHC started,” Vartak continues, “we wanted to see whether there was a new resonance – a new particle – at a few TeV or so, at energies that the Tevatron could not probe.”
It is not only the spectacular that particle physicists are after. Many of the searches conducted by the Exotics groups of ATLAS and CMS look for answers to particularly puzzling questions. For example, the data have so far shown that quarks are elementary particles; that is, they are themselves not made up of any particles. But we don’t know for sure if that is the case. Finding quarks in excited states at high energies would demonstrate that they have inner substructure. Another puzzle is that the two families of fermions – leptons and quarks – curiously come in three generations each. There is no particular reason for this, unless they are somehow related to one another. ATLAS and CMS are therefore looking for leptoquarks, particles predicted to be hybrids of both kinds of fermions.
Physicists are also looking for previously unobserved quantum forces, which would manifest in the form of heavier versions of the electroweak-force-carrying W and Z bosons, called W′ (“W-prime”) and Z′ (“Z-prime”). In the case of neutrinos, the reason for their extremely light but non-zero mass could be explained by discovering heavier exotic neutrinos, which balance the lighter regular neutrinos through a “see-saw” mechanism.
Other searches are for heavier Higgs bosons, charged Higgs bosons and even composite (non-elementary) Higgs bosons. Yet others focus on hypothesised magnetic monopoles (particles with a single north or south pole) that, rather than bending in the high magnetic fields of ATLAS and CMS, would get accelerated through them.
Of course, experimentalists are also looking for any new particles and phenomena, even ones not explicitly predicted by theory. Giudice, a theorist, adds: “Experimentalists can make progress without a theorist telling them, ‘Oh, this comes from this model.’ Before the LHC, much of the analysis was done in terms of models. Now they try to present the data without relying on a specific model but rather on a broader language.”

The 750-GeV bump-that-was-not

Graph showing the sharp rise in arXiv paper submissions after December 2015, as theorists attempted to explain the 750-GeV bump in the data. (Image: André David)
This model-independent approach caused much excitement in 2015. Over the course of the first year of the LHC’s second run, both ATLAS and CMS began to notice something peculiar in their data. There appeared to be a slight excess of events in the two-photon channel at a mass of 750 GeV/c² in both their data sets. Initially the excess was of very low statistical significance, far from the 5-sigma threshold for claiming a discovery. Nevertheless it intrigued experimentalists and theorists alike. “As data began to be collected,” Vartak recounts, “there was a lot of hope that a new kind of physics – one that had evaded us previously – would show up.”
In December, at the annual end-of-year seminar from the LHC experiments, excitement reached fever pitch. ATLAS and CMS presented data showing that the significance of the excess was around 3 sigma. In the following three months, around 300 papers were submitted to arXiv by theorists seeking to explain the inexplicable. But by the time all of the data from Run 2 (2015–2018) had been studied, the excess had evaporated. There is a reason physicists wait until the 5-sigma threshold is breached: smaller excesses are not unusual and usually turn out to be statistical fluctuations.
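The sigma levels quoted here are tail probabilities of a standard normal distribution. A minimal sketch of the conversion, using the one-sided convention typically applied to an excess (the function name is illustrative):

```python
import math

def one_sided_p_value(n_sigma: float) -> float:
    """Probability that a pure background fluctuation reaches at
    least n_sigma, for a Gaussian-distributed test statistic."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# A 3-sigma excess arises by chance roughly once in 740 tries;
# crossing 5 sigma, only about once in 3.5 million.
print(one_sided_p_value(3.0))  # ~1.35e-3
print(one_sided_p_value(5.0))  # ~2.87e-7
```

This is why a 3-sigma bump is intriguing but not decisive: across the many channels and mass points the experiments scan, fluctuations of that size are expected to appear occasionally.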
For the Exotics teams, though, it was the closest they came to something from beyond the Standard Model.

Changes in strategy and the road ahead
The lessons learnt so far are helping shape the search strategies of the future. For one, the triggers that select the collision data worth storing for further analysis are being recalibrated to handle so-called “long-lived particles”, which might transform into lighter particles outside of the typical timeframe when the triggers take their snapshots of collisions. Efforts are also underway to reanalyse the data recorded so far using novel techniques.
The challenges provide more than adequate motivation and inspiration. “I’ve always had a lot of fun with my research,” Everaerts says with a smile. “Collaborating with people from different backgrounds to work on a common goal: I find that amazing!”
So what is the legacy of the LHC after its first decade of operation? Giudice is emphatic: “The LHC has changed radically the way we view the world of particle physics today.” It might not have shown what theorists hoped it would, but it has helped make important strides in both theory and experiment. “When I start with an idea and then it turns out to be wrong, it’s not a question of failure; it is the scientific method,” Giudice continues. “You make a hypothesis, you check it with experiment, if it is right you keep on going but if it is wrong, you explore a different hypothesis.”
“As experimentalists,” Meloni adds, “when we search beyond the Standard Model, our job is to look for everything. We know that we look for something, we look for it everywhere, and chances are it’s not going to be there. Still, our job is to understand our measurement, our search, and get to the result. And then the results are up for interpretation.”
After all, looking and not finding is not the same as not looking.
The goal of the 28th International Workshop on Weak Interactions and Neutrinos (WIN 2021) is to offer the community a significant opportunity to assess the status of the field and to initiate collaborative efforts to address current physics questions. Following up on previous successful workshops, most recently in Heidelberg (2015), Irvine (2017) and Bari (2019), WIN 2021 will be held from Monday, June 7 to Saturday, June 12, 2021. Due to international travel uncertainties, WIN 2021 will be entirely online, with sessions between 13:00 and 17:00 GMT.
Registration (required, but at no cost) and additional information about the workshop are now available at the WIN 2021 website (http://win2021.umn.edu).
The topics for WIN 2021 are:
The names and contact information for theoretical and experimental conveners for each of these four topics are listed on the WIN 2021 website.
Participation opportunities for WIN 2021 include:
Participants interested in proposing contributions should submit abstracts using the workshop Indico link: https://indico.fnal.gov/event/win2021/.
Please direct questions to the Organizing Committee at: email@example.com.