
Searching for matter–antimatter asymmetry with the Higgs boson

Wed, 22/06/2022 - 17:47

Symmetries make the world go round, but so do asymmetries. A case in point is an asymmetry known as charge–parity (CP) asymmetry, which is required to explain why matter vastly outnumbers antimatter in the present-day universe even though both forms of matter should have been created in equal amounts in the Big Bang.

The Standard Model of particle physics – the theory that best describes the building blocks of matter and their interactions – includes sources of CP asymmetry, and some of these sources have been confirmed in experiments. However, these Standard Model sources collectively generate an amount of CP asymmetry that is far too small to account for the matter–antimatter imbalance in the universe, prompting physicists to look for new sources of CP asymmetry.

In two recent independent investigations, the international ATLAS and CMS collaborations at the Large Hadron Collider (LHC) turned to the Higgs boson that they discovered ten years ago to see if this unique particle hides a new, unknown source of CP asymmetry.

The ATLAS and CMS teams had previously searched for – and found no signs of – CP asymmetry in the interactions of the Higgs boson with other bosons as well as with the heaviest known fundamental particle, the top quark. In their latest studies, ATLAS and CMS searched for this asymmetry in the interaction between the Higgs boson and the tau lepton, a heavier version of the electron.

To search for this asymmetry, ATLAS and CMS first looked for Higgs bosons transforming, or “decaying”, into pairs of tau leptons in proton–proton collision data recorded by the experiments during the second run of the LHC (2015–2018). They then analysed this decay’s motion, or “kinematics”, which depends on an angle, called the mixing angle, that quantifies the amount of CP asymmetry in the interaction between the Higgs boson and the tau lepton.

In the Standard Model, the mixing angle is zero and thus the interaction is CP symmetric, meaning that it remains the same under a transformation that swaps a particle with the mirror image of its antiparticle. In theories that extend the Standard Model, however, the angle may deviate from zero and the interaction may be partially or fully CP asymmetric depending on the angle; an angle of -90 or +90 degrees corresponds to a fully CP-asymmetric interaction, whereas any angle in between, except 0 degrees, corresponds to a partially CP-asymmetric interaction.
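This picture can be made concrete with the standard parametrisation used for such analyses (quoted here for orientation; the symbol for the mixing angle is the conventional choice, not taken from the text above). The Higgs boson–tau lepton interaction is written as

```latex
\mathcal{L}_{\mathrm{Y}} = -\frac{m_\tau}{v}\,\kappa_\tau\,
  \bar{\tau}\left(\cos\phi_{\tau\tau} + i\gamma_5 \sin\phi_{\tau\tau}\right)\tau\, H
```

where the cosine term is CP-even and the sine term is CP-odd, so a mixing angle of 0 degrees reproduces the Standard Model coupling while an angle of -90 or +90 degrees gives a purely CP-odd, fully CP-asymmetric coupling.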

After analysing their samples of Higgs boson decays into tau leptons, the ATLAS team obtained a mixing angle of 9 ± 16 degrees and the CMS team −1 ± 19 degrees, both of which exclude a fully CP-asymmetric Higgs boson–tau lepton interaction with a statistical significance of about three standard deviations.

The results are consistent with the Standard Model within the present measurement precision. More data will allow researchers to either confirm this conclusion or spot CP asymmetry in the Higgs boson–tau lepton interaction, which would have a profound impact on our understanding of the history of the universe.

With the third run of the LHC set to start soon, the ATLAS and CMS collaborations won’t need to wait too long before they can feed more data into their analysis kits to find out whether or not the Higgs boson hides a new source of CP asymmetry.

By Ana Lopes. Published 23 June 2022.

CMS on the lookout for new physics

Fri, 17/06/2022 - 15:18

With Run 3 of the LHC just around the corner, the LHC experiments are still publishing new results based on data from the previous runs. While no new discoveries have been announced, small deviations from expectations are appearing in a handful of analyses. At the current level these deviations can still be attributed to random fluctuations in the data, but they mark regions that will need to be investigated closely once the new stream of collisions arrives. Below are a few examples recently published by the CMS collaboration.

In 2017 CMS recorded a spectacular collision event containing four particle jets in the final state. The invariant mass of all four jets was 8 TeV, and the jets could be divided into two pairs, each with an invariant mass of 1.9 TeV. Such a configuration could be produced if a new particle with a mass of 8 TeV had been created in the proton–proton collision and had subsequently decayed into a pair of – again, new – particles with masses of 1.9 TeV. In a new analysis recently published by CMS, a search for such twin pairs of jets with matching invariant masses was performed on the data collected up to the end of LHC Run 2. Remarkably, a second event with similarly striking properties was found, with a four-jet mass of 8.6 TeV and dijet masses of 2.15 TeV. These two events can be clearly seen in the plot below, where the four-jet events are plotted as a function of the dijet and four-jet masses.
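As a sketch of the kinematics involved, the invariant mass of a set of jets with four-momenta (E, px, py, pz) is m = sqrt((ΣE)² − |Σp|²), and a pairing like the one described above can be chosen by minimising the difference between the two dijet masses. The four-momenta below are illustrative numbers, not CMS data:

```python
import itertools
import math

def invariant_mass(jets):
    """Invariant mass of a list of four-vectors (E, px, py, pz)."""
    E = sum(j[0] for j in jets)
    px = sum(j[1] for j in jets)
    py = sum(j[2] for j in jets)
    pz = sum(j[3] for j in jets)
    # max(..., 0.0) guards against tiny negative values from rounding
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

def best_pairing(jets):
    """Split four jets into the two pairs whose dijet masses match best."""
    assert len(jets) == 4
    best = None
    for pair in itertools.combinations(range(4), 2):
        rest = tuple(i for i in range(4) if i not in pair)
        m1 = invariant_mass([jets[i] for i in pair])
        m2 = invariant_mass([jets[i] for i in rest])
        if best is None or abs(m1 - m2) < abs(best[0] - best[1]):
            best = (m1, m2)
    return best

# Illustrative, made-up jets in TeV (roughly massless, two back-to-back pairs)
jets = [(2.0, 1.8, 0.4, 0.6), (2.1, -1.7, -0.5, 0.7),
        (2.4, 0.5, 2.2, -0.6), (2.2, -0.3, -2.0, -0.8)]
m1, m2 = best_pairing(jets)      # matched dijet masses
m4 = invariant_mass(jets)        # four-jet mass
```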

Number of events observed (colour scale), plotted as a function of four-jet mass and the average mass of the two dijets. The two points in the top right correspond to the two interesting events. (Image: CMS)

While nearly all observed events with two pairs of jets are produced by strong interactions between the colliding protons, events with such high invariant masses are extremely unlikely. The probability of seeing two events at these masses without any new phenomena being present is of the order of 1 in 20 000, corresponding to a local significance of 3.9σ. While this may appear to be a very strong signal at first, given that the range of masses being analysed is large, it is important to also look at the global significance, which indicates the probability of observing such an excess anywhere in the analysed region. For the two events the global significance is only 1.6σ.
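The deflation from local to global significance (the "look-elsewhere effect") can be sketched numerically. The trials factor below is an illustrative assumption, not the value used by CMS:

```python
from statistics import NormalDist

nd = NormalDist()

def p_from_z(z):
    """One-sided tail probability for a significance of z standard deviations."""
    return 1.0 - nd.cdf(z)

def z_from_p(p):
    """Significance in standard deviations for a one-sided tail probability p."""
    return nd.inv_cdf(1.0 - p)

# A local significance of 3.9 sigma corresponds to a probability
# of roughly 1 in 20 000 for a fluctuation at this particular mass.
p_local = p_from_z(3.9)

# Look-elsewhere effect: if the excess could have appeared in any of
# N quasi-independent places (N = 1000 is an illustrative assumption),
# the chance of seeing such a fluctuation *somewhere* is much larger.
N = 1000
p_global = 1.0 - (1.0 - p_local) ** N
z_global = z_from_p(p_global)   # well below the 3.9 sigma local value
```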

Two other searches for new heavy particles report small excesses in the data. In a search for high-mass resonances decaying into a pair of W bosons (which then decay into leptons), the largest deviation corresponds to a signal hypothesis with a mass of 650 GeV, with a local significance of 3.8σ and a global significance of 2.6σ. In a search for heavy particles decaying into a pair of bosons (WW, WZ or other combinations, also including Higgs bosons) that subsequently decay into pairs of jets, the data diverge from expectations in two places. The signal hypothesis is a W’ boson with a mass of 2.1 or 2.9 TeV decaying into a WZ pair; the highest local significance is 3.6σ, with a global significance of 2.3σ.

Another new result comes from searches for additional Higgs bosons decaying into tau pairs. For a new particle with a mass of 100 GeV there is a small excess in the data, with a local significance of 3.1σ and a global significance of 2.7σ. Interestingly, this coincides with a similar excess seen by CMS in a previous search for low-mass resonances in the two-photon final state. Another excess is visible in the high-mass range, with the largest deviation from expectation observed for a mass of 1.2 TeV with a local (global) significance of 2.8σ (2.4σ).

The tau pair final state was also used to look for hypothetical new particles called leptoquarks. This is of particular interest because leptoquarks could potentially explain the flavour anomalies observed by the LHCb experiment, so if the anomalies are indeed a manifestation of new phenomena, leptoquark searches would provide an independent way to probe them. No excess has been found by CMS so far, but the analysis is only just beginning to be sensitive to the range of leptoquark parameters that could explain the flavour anomalies, so more data are needed to fully explore the leptoquark hypothesis.

The new LHC data-taking period is set to start in July, at higher energy and with significantly upgraded detectors, promising a fresh stream of data for searches for new phenomena.

Read more in the CERN Courier and in the CMS publications.

By Piotr Traczyk. Published 17 June 2022.

Higgs10: The dramatic last year of CERN’s flagship LEP collider

Wed, 08/06/2022 - 17:25
The LHC will be built inside the same tunnel as an existing accelerator, the Large Electron Positron (LEP) collider, which came on stream in 1989. LEP will be removed from the tunnel at the end of this year to make way for the LHC. Here technicians make delicate adjustments to one of LEP’s thousands of magnets. (Image: CERN)

By: Luciano Maiani and Roger Cashmore

31 May 2022

The year 2000 was set to be the last year of running for CERN’s Large Electron–Positron (LEP) collider, and it ended in dramatic fashion. Luciano Maiani was Director-General and Roger Cashmore Research Director as the new millennium dawned.

Luciano Maiani (left) and Lyn Evans look from the LHC transfer tunnel, TI2, into the LEP/LHC tunnel just after the tunneling machine broke through on 15 May 2001. The decision to close LEP in 2000 allowed LHC works to proceed at full pace. (Image: CERN)

Roger Cashmore :

The final year of LEP operation, 2000, had been agreed on at CERN by all of the relevant committees. By this time, the LEP experiments – ALEPH, DELPHI, L3 and OPAL – had established the Standard Model of particle physics with great precision. LEP had achieved its mission, and the only thing missing from the Standard Model was the elusive Higgs particle. Nobody knew whether the Higgs was within LEP’s reach, but detailed analysis suggested that its mass might be not much more than 100 GeV and that it would be produced in electron–positron collisions in association with a Z particle. In other words, the LEP experiments might have a chance of crowning their achievements with a spectacular discovery to start the new millennium.

There was nothing to lose and, as the 2000 run got underway, the machine was pushed to its limits. A cut-off date of 1 September had been set, and a closing celebration planned for the following month. Throughout the year, regular reports were made to the LEP Experiments Committee (LEPC), but there was no sign of a Higgs up to a mass of about 110 GeV. The decision was taken to push the beam energy beyond the limits through July and August: at this stage, if something broke, it really didn’t matter. And that was when the situation became exciting. A small excess of events was observed by the ALEPH experiment at a mass of about 114 GeV, but with no supporting evidence from the other experiments. Nevertheless, I telephoned Luciano to keep him informed that we might have an exciting time on our hands, and potentially a very difficult one! As a result of the ALEPH candidates, LEP’s final run was extended through to the end of October.

Luciano Maiani : 

I remember Roger’s call like it was yesterday. Whatever happened next was going to require some difficult decisions. In October, we celebrated the conclusion of the LEP programme in the presence of eminent representatives of the Member States, even though the machine was still running. ALEPH’s excess was still there so, after the speeches were done, we discreetly started to work out the cost of running LEP for another year, and the repercussions it would have on the construction of the LHC.

The problem was that LHC excavations would soon reach the LEP tunnel, so an extra year of running would mean that work would have to stop, contracts be terminated and penalties paid to the companies involved, not to mention the extra running costs that had not been budgeted for. In total, we worked out that it would cost some 120 MCHF, and deal a major psychological blow to the LHC community. We had no way of anticipating how the LHC experiments’ funding agencies would react to the news of a year’s delay.

As October progressed, the other LEP experiments did not see anything, and ALEPH did not find any more candidates. LEP’s illustrious career seemed to be coming to an uneventful end, but there was to be one final twist: towards the end of the month, the L3 experiment announced an event that seemed to change everything. It was a two-jet event. Each jet contained a b quark, and there was missing energy corresponding to the mass of a Z particle. Significantly, the jets had the fateful energy of around 114 GeV.

Michel Spiro (left) and Roger Cashmore speaking at the LEP Fest, a celebration of the achievements of LEP on 10 October 2000. (Image: CERN)

L3’s event could be interpreted as the production of the same particle that ALEPH seemed to see decaying into a b-anti-b quark pair, with the accompanying Z decaying into two invisible neutrinos. In short, it could be another trace of the existence of the Higgs boson.

We discussed the L3 event thoroughly with LEPC Chair Michel Spiro and concluded that it was inconclusive. It could be a Higgs, but it could equally well be something much more mundane: there was no imbalance in transverse energy as there had been in the 1980s when Carlo Rubbia had announced the discovery of the Z boson. Without that, the missing energy could have been lost down the beam pipes and so gone undetected and, importantly, there were well-known electromagnetic processes that would produce just such an outcome.

The L3 event was not a smoking gun after all, and we were left at the end of the month with a very difficult decision to take. Whatever we decided, some part of the community would be disappointed. Events proceeded quickly. On 3 November, LEPC delivered its verdict: not conclusive. Similar verdicts were then delivered by the Research Board and the Scientific Policy Committee (SPC). The decision was left to us and, along with Roger and the whole Directorate, we made our decision. For us, LEP was over; the LHC was the best machine to tell us whether there was a Higgs at 114 GeV, or whether LEP had been chasing phantoms.

By 4 November I had already written to George Kalmus, the Chair of the SPC. “The idea that we may find ourselves in September 2001 with 3.5–4 sigma, CERN’s financial position aggravated, LHC delayed and LHC people disbanded is not very encouraging. I am not going to go this way.” On 17 November, we recommended no additional year of LEP running to the Committee of Council. Faced with the alternative of betting 120 MCHF on the roulette wheel of a few anomalous events, the Council wisely accepted our advice.

LEP’s final year had been an emotionally charged rollercoaster ride. The lights never went out at CERN as analyses were refined around the clock and, when our decision became known, it was greeted with relief, shock and disbelief in equal measure. At the end of 2000, the Council’s decision moved us firmly into the LHC era, ready to fully explore the Higgs and much more.

 


Three-quarters of the way there

Tue, 31/05/2022 - 12:04
Higgs10 Bulletin article - 3 (Image: CERN)

By: Matthew Chalmers

25 May 2022

In the third part of the Higgs10 series, we see how the direct discovery of the W and Z bosons at the SppS in 1983 provided solid experimental support for the existence of the Higgs boson

Press conference on the announcement of the W and Z bosons. From left to right: Carlo Rubbia, spokesman of the UA1 experiment; Simon van der Meer, responsible for developing the stochastic cooling technique; Herwig Schopper, Director-General of CERN; Erwin Gabathuler, Research Director at CERN, and Pierre Darriulat, spokesman of the UA2 experiment (Image: CERN)

4 July 2012 wasn’t the first time physicists had packed themselves into the CERN auditorium to witness the discovery of a new elementary particle. To rapturous applause on 20 January 1983, Carlo Rubbia, spokesperson of the UA1 experiment at the SppS collider, presented six candidate events for the W boson, the electrically charged carrier of the weak interaction responsible for radioactive decay. In similar scenes the following afternoon, Luigi Di Lella of the UA2 experiment announced four W candidates. Along with the Z boson and massless photon, the W boson is one of three “gauge” bosons of a unified electroweak interaction that demands the existence of a fourth “scalar” particle called the Higgs boson.

Indirect evidence for the Z boson had been obtained a decade earlier at Gargamelle, driving the community to seek a direct discovery of the massive electroweak bosons. But their predicted masses – around 80 and 90 GeV for the W and Z, respectively – were beyond the reach of experiments at the time. In 1976, Rubbia, Peter McIntyre and David Cline suggested modifying the CERN SPS from a one-beam accelerator into a machine that would collide beams of protons and antiprotons, greatly increasing the available energy. Simon van der Meer had already invented a way of producing and storing dense beams of protons or antiprotons, while his “stochastic cooling” method to reduce the energy spread and angular divergence of the beams had been honed at the Intersecting Storage Rings (the world’s first hadron collider). Many doubted the wisdom of the decision, however, especially as CERN was keen to push its visionary Large Electron–Positron (LEP) collider.

First direct production of the W boson in the UA1 detector in late 1982 (Image: CERN)

As former UA2 spokesperson Pierre Darriulat wrote in CERN Courier in 2004: “The pressure to discover the W and Z was so strong that the long design, development and construction time of the LEP project left most of us, even the most patient, dissatisfied. A quick (but hopefully not dirty) look at the new bosons would have been highly welcome. But when proton–proton colliders such as the Superconducting Intersecting Storage Rings were proposed in this spirit, they were ‘killed in the egg’ by the management at CERN, with the argument that they would delay – or, even worse, endanger – the LEP project. The same argument did not apply to the proton–antiproton collider, as it did not require the construction of a new collider ring and could be proposed as an experiment … Another argument also made it possible for the proton–antiproton project to break the LEP taboo: if CERN did not buy Carlo’s idea, it was most likely that he would sell it to Fermilab.”

Two detectors, UA1 and UA2, built around the SppS beam pipe to search for signatures of the W and Z particles, started taking collision data in 1981. When they confirmed the existence of the W boson – which was announced at a press conference at CERN on 25 January 1983, followed by the discovery of the Z boson a few months later and the Nobel Prize in Physics for Rubbia and Van der Meer the following year – the case for the existence of the Higgs boson grew stronger.

First direct production of the Z boson in the UA1 detector in April 1983 (Image: CERN)

That’s because all three bosons hail from the same “Mexican hat”-shaped Brout-Englert-Higgs (BEH) field that broke the electroweak symmetry a fraction of a nanosecond after the Big Bang and left the universe with a non-zero vacuum expectation value. As the universe transitioned from a symmetrical state at the top of the hat to a more stable configuration in the rim, three of the BEH field’s four mathematical components were absorbed to generate masses for the W and Z bosons (while keeping the photon massless); the fourth, corresponding to an otherworldly oscillation up and down the rim of the Mexican hat, is the Higgs boson.

In 1983, assuming that the electroweak Standard Model and BEH mechanism were correct, three quarters of the BEH field had been discovered. LEP went on to measure the properties of the W and Z bosons in great detail, helping to constrain the possible hiding places for the “remaining quarter”. The Standard Model does not predict the mass of the Higgs boson. Finding it would require an even more powerful machine. Thanks to the foresight of CERN Director-General John Adams in 1977, the LEP tunnel was designed to be large enough to accommodate the proton–proton collider that, 35 years later, would uncover the final quarter of the mysterious scalar field that pervades the universe and gives mass to elementary particles.

Carlo Rubbia, spokesperson for the UA1 experiment, announces six candidate W boson events in a seminar on 20 January 1983 (Video: CERN)

 


The Higgs boson and the rise of the Standard Model of Particle Physics in the 1970s

Mon, 16/05/2022 - 15:46
At Gargamelle with Paul Musset (Image: CERN)

By: John Ellis

10 May, 2022 · Voir en français

At the dawn of the 1970s, the idea of a massive scalar boson as the keystone of a unified theoretical model of the weak and electromagnetic interactions had yet to become anchored in a field that was still learning to live with what we now know as the Standard Model of Particle Physics. As the various breakthroughs of the decade gradually consolidated this theoretical framework, the Brout–Englert–Higgs (BEH) field and its boson emerged as the most promising theoretical model to explain the origin of mass.

In the 1960s, there were remarkably few citations of the papers by Sheldon Glashow, Abdus Salam and Steven Weinberg on the theory of unified weak and electromagnetic interactions. All that changed, however, in 1971 and 1972 when, in Utrecht, Gerard ’t Hooft and Martinus Veltman (a former CERN staff member) proved that gauge theories employing the Brout-Englert-Higgs mechanism to generate masses for gauge bosons are renormalisable, and hence are mathematically consistent and can be used to make reliable, precise calculations for the weak interactions. This breakthrough was given broad publicity in an influential talk by Benjamin Lee of Fermilab during the ICHEP conference held there in 1972, in which he talked at length about “Higgs fields”.

Encouraged, in particular, by the CERN theorists Jacques Prentki and Bruno Zumino, the Gargamelle collaboration prioritised the search for weak neutral current interactions in the CERN neutrino beam, and their representative Paul Musset presented the first direct evidence for them in a seminar at CERN on 19 July 1973. This first experimental support for the unification of the electromagnetic and weak interactions attracted great interest and close scrutiny, but was generally accepted within a few months. The neutral-current discovery convinced physicists that the nascent Standard Model was on the right track. Former CERN Director-General Luciano Maiani, quoted in a 2013 CERN Courier article, puts it this way:

"At the start of the decade, people did not generally believe in a standard theory, even though theory had done everything. The neutral-current signals changed that. From then on, particle physics had to test the standard theory." – Luciano Maiani

 

Mary K. Gaillard (center), her granddaughter Cleo (left), and John Ellis (right), in 2019, during the celebration of Mary’s 80th birthday. (Image: Berkeley Science Review)

The next breakthrough came in 1974, when two experimental groups working in the United States, led by Sam Ting at Brookhaven and Burt Richter at SLAC, discovered a narrow vector resonance, the J/psi, with prominent decays into lepton–antilepton pairs. Many theoretical interpretations were proposed, which we at CERN discussed over the phone in excited midnight seminars with Fred Gilman at SLAC (almost 40 years before Zoom!). The winning interpretation was that the J/psi was a bound state of the charm quark and its antiquark. The existence of this fourth quark had been proposed by James Bjorken and Sheldon Glashow in 1964, and its use to suppress flavour-changing neutral weak interactions had been proposed by Glashow, John Iliopoulos and Maiani in 1970. Mary K. Gaillard (a long-term visiting scientist at CERN), Jon Rosner and Lee wrote an influential paper on the phenomenology of charm in 1974, and experiments gradually fell into line with their predictions, with final confirmation coming in 1976.

The attention of most of the theoretical and experimental communities was then drawn towards the search for the massive W and Z vector bosons responsible for the weak interactions. This motivated the construction of high-energy hadron colliders and led to the discovery of the W and Z bosons at CERN in 1983 by a team led by Carlo Rubbia.

However, it seemed to Mary K. Gaillard, Dimitri Nanopoulos and myself at CERN that the key question was not the existence of the massive weak vector bosons, but rather that of the scalar Higgs boson that enabled the Standard Model to be physically consistent and mathematically calculable. At the time, the number of papers on the phenomenology of the Higgs boson could be counted on the fingers of one hand, so we set out to describe its phenomenological profile in some detail, covering a wide range of possible masses. Among the production mechanisms we considered was the possible production of the Higgs boson in association with the Z boson, which generated considerable interest in the days of LEP 2. Among the Higgs decay modes we calculated was that into a pair of photons. This distinctive channel is particularly interesting because it is generated by quantum effects (loop diagrams) in the Standard Model.

Despite our conviction that something like the Higgs boson had to exist, our paper ended on a cautionary note that was somewhat tongue-in-cheek:

"We apologise to experimentalists for having no idea what is the mass of the Higgs boson … and for not being sure of its couplings to other particles, except that they are probably all very small. For these reasons we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up."
John Ellis, Mary K. Gaillard and Dimitri Nanopoulos

This caution was in part because the senior physicists of the day (Dimitri and I were under 30 at the time) regarded the ideas surrounding electroweak symmetry breaking and the Higgs boson with rather jaundiced eyes. Nevertheless, as time went on, the massive W and Z were discovered, the existence or otherwise of the Higgs boson rose up the experimental agenda, and no plausible alternative theoretical suggestions to the existence of something like the Higgs boson emerged. Experimentalists, first at LEP and later at the Tevatron and the LHC, focused increasingly on searches for the Higgs boson as the final building block of the Standard Model, culminating in the discovery on 4 July 2012.

As the various breakthroughs of the 1970s gradually consolidated the Standard Model, the Brout–Englert–Higgs field and its boson emerged as the most promising theoretical model to explain the origin of mass

ALICE makes first direct observation of a fundamental effect in particle physics

Mon, 16/05/2022 - 13:09
ALICE makes first direct observation of a fundamental effect in particle physics

 

The ALICE collaboration at the Large Hadron Collider (LHC) has made the first direct observation of the dead-cone effect – a fundamental feature of the theory of the strong force that binds quarks and gluons together into protons, neutrons and, ultimately, all atomic nuclei. In addition to confirming this effect, the observation, reported in a paper published today in Nature, provides direct experimental access to the mass of a single charm quark before it is confined inside hadrons.

“It has been very challenging to observe the dead cone directly,” says ALICE spokesperson Luciano Musa. “But, by using three years’ worth of data from proton–proton collisions at the LHC and sophisticated data-analysis techniques, we have finally been able to uncover it.”

Quarks and gluons, collectively called partons, are produced in particle collisions such as those that take place at the LHC. After their creation, partons undergo a cascade of events called a parton shower, whereby they lose energy by emitting radiation in the form of gluons, which also emit gluons. The radiation pattern of this shower depends on the mass of the gluon-emitting parton and displays a region around the direction of flight of the parton where gluon emission is suppressed – the dead cone [1].

Predicted thirty years ago from the first principles of the theory of the strong force, the dead cone has been indirectly observed at particle colliders. However, it has remained challenging to observe it directly from the parton shower’s radiation pattern. The main reasons for this are that the dead cone can be filled with the particles into which the emitting parton transforms, and that it is difficult to determine the changing direction of the parton throughout the shower process.

The ALICE collaboration overcame these challenges by applying state-of-the-art analysis techniques to a large sample of proton–proton collisions at the LHC. These techniques can roll the parton shower back in time from its end-products – the signals left in the ALICE detector by a spray of particles known as a jet. By looking for jets that included a particle containing a charm quark, the researchers were able to identify a jet created by this type of quark and trace back the quark’s entire history of gluon emissions. A comparison between the gluon-emission pattern of the charm quark with that of gluons and practically massless quarks then revealed a dead cone in the charm quark’s pattern.

The result also directly exposes the mass of the charm quark, as theory predicts that massless particles do not have corresponding dead cones.

“Quark masses are fundamental quantities in particle physics, but they cannot be accessed and measured directly in experiments because, with the exception of the top quark, quarks are confined inside composite particles,” explains ALICE physics coordinator Andrea Dainese. “Our successful technique to directly observe a parton shower’s dead cone may offer a way to measure quark masses.”

As the parton shower proceeds, gluons are emitted at smaller angles and the energy of the quark decreases, resulting in larger dead cones of suppressed gluon emission. (Image: CERN)

Further information:

 

[1] Technical note: specifically, for an emitter of mass m and energy E, gluon emission is suppressed at angles smaller than m/E, relative to the emitter’s direction of motion.
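As a numerical illustration of the note above: the suppression angle scales as m/E, so the dead cone shrinks as the quark’s energy grows and widens as the quark loses energy in the shower, matching the trend described in the caption. The short Python sketch below uses an illustrative charm-quark mass of 1.27 GeV; the function name and values are assumptions for illustration, not part of the ALICE analysis:

```python
# Dead-cone half-opening angle theta_0 ~ m/E for a massive emitter (small-angle regime).
# The charm-quark mass below is an illustrative value, not a measurement from the article.

def dead_cone_angle(mass_gev: float, energy_gev: float) -> float:
    """Angle (radians) below which gluon emission is suppressed, relative to the emitter."""
    return mass_gev / energy_gev


M_CHARM = 1.27  # GeV, illustrative charm-quark mass

for energy in (10.0, 50.0, 200.0):
    # Higher emitter energy -> smaller dead cone; lower energy -> larger dead cone.
    print(f"E = {energy:6.1f} GeV  ->  theta_0 ~ {dead_cone_angle(M_CHARM, energy):.4f} rad")
```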

 

Published: 18 May 2022, 17:00

CLOUD discovers new way by which aerosols rapidly form and grow at high altitude

Fri, 13/05/2022 - 10:48
CLOUD discovers new way by which aerosols rapidly form and grow at high altitude

Aerosol particles can form and grow in Earth’s upper troposphere in an unexpected way, reports the CLOUD collaboration in a paper [1] published today in Nature. The new mechanism may represent a major source of cloud and ice seed particles in areas of the upper troposphere where ammonia is efficiently transported vertically, such as over the Asian monsoon regions.

Aerosol particles are known to generally cool the climate by reflecting sunlight back into space and by making clouds more reflective. However, how new aerosol particles form in the atmosphere remains relatively poorly known.

“Newly formed aerosol particles are ubiquitous throughout the upper troposphere, but the vapours and mechanisms that drive the formation of these particles are not well understood,” explains CLOUD spokesperson Jasper Kirkby. “With experiments performed under cold upper tropospheric conditions in CERN’s CLOUD chamber, we uncovered a new mechanism for extremely rapid particle formation and growth involving novel mixtures of vapours.”

Using mixtures of sulfuric acid, nitric acid and ammonia vapours in the chamber at atmospheric concentrations, the CLOUD team found that these three compounds form new particles synergistically, at rates much faster than those for any combination of two of the compounds. The three vapours together form new particles 10–1000 times faster than a sulfuric acid–ammonia mixture, which earlier CLOUD measurements had indicated was the dominant source of upper-tropospheric particles. Once the three-component particles form, they can grow rapidly from the condensation of nitric acid and ammonia alone to sizes where they seed clouds.

Moreover, the CLOUD measurements show that these particles are highly efficient at seeding ice crystals, comparable to desert dust particles, which are thought to be the most widespread and effective ice seeds in the atmosphere. When a supercooled cloud droplet freezes, the resulting ice particle will grow at the expense of any unfrozen droplets nearby, so ice has a major influence on cloud microphysical properties and precipitation.

The CLOUD researchers went on to feed their measurements into global aerosol models that include vertical transport of ammonia by deep convective clouds. The models showed that, although the particles form locally in ammonia-rich regions of the upper troposphere such as over the Asian monsoon regions, they travel from Asia to North America in just three days via the subtropical jet stream, potentially influencing Earth’s climate on an intercontinental scale.

“Our results will improve the reliability of global climate models in accounting for aerosol formation in the upper troposphere and in predicting how the climate will change in the future,” says Kirkby. “Once again, CLOUD is finding that anthropogenic ammonia has a major influence on atmospheric aerosol particles, and our studies are informing policies for future air pollution regulations.”

Atmospheric concentrations of sulfuric acid, nitric acid and ammonia were much lower in the pre-industrial era than they are now, and each is likely to follow different concentration trajectories under future air pollution controls. Ammonia in the upper troposphere originates from livestock and fertiliser emissions – which are unregulated at present – and is carried aloft in convective cloud droplets, which release their ammonia upon freezing.

Simulation of aerosol particle formation during the Asian monsoon in a global aerosol model with efficient vertical transport of ammonia into the upper troposphere. Including a mixture of sulfuric acid, nitric acid and ammonia enhances upper-tropospheric particle number concentrations over the Asian monsoon region by a factor of 3–5 compared with the same model with only sulfuric acid and ammonia. (Image: CLOUD collaboration)

Pictures: https://cds.cern.ch/record/2806655

[1] Wang, M. et al. Synergistic HNO3–H2SO4–NH3 upper tropospheric particle formation. Nature, doi:10.1038/s41586-022-04605-4 (2022).

Published: 18 May 2022, 17:30


A boson is born

Mon, 09/05/2022 - 14:38
A graphic artistic view of the Brout-Englert-Higgs field (Image: CERN)

By: Matthew Chalmers

28 April, 2022

In the first part of the Higgs10 series, we look at how the Higgs boson came to be

On 4 July 2012, half a century’s wait came to an end as the ATLAS and CMS experiments announced the discovery of the Higgs boson. Celebrate 10 years since this extraordinary achievement by learning more about the history that led up to it, the next steps in understanding the mysterious particle, and CERN’s role in this endeavour. The “Higgs10” series will walk you through this journey, starting with an account by CERN Courier editor Matthew Chalmers of the theorisation of the Higgs boson in the 1960s.

_______________

Theoretical physicists François Englert (left) and Peter Higgs at CERN on 4 July 2012, at the announcement of the discovery of a Higgs boson by the ATLAS and CMS experiments.  (Image: M. Brice/CERN)

It’s every theoretical physicist’s dream to conjure a new particle from mathematics and have it observed by an experiment. Few have scaled such heights, let alone had a particle named after them. In the CERN auditorium on 4 July 2012, Peter Higgs wiped a tear from his eye when the ATLAS and CMS results came in. The Higgs boson holds the record (48 years) among elementary particles for the time between prediction and discovery, going from an esoteric technicality to commanding the global spotlight at the world’s most powerful collider.

Revealing that the universe is pervaded by a stark “scalar” field responsible for generating the masses of elementary particles was never something Robert Brout and François Englert, and independently Peter Higgs, set out to do. Their short 1964 papers – one by Brout and Englert, two others by Higgs – concerned an important but niche problem of the day. “Of no obvious relevance to physics” was how an editor of Physics Letters is said to have remarked on rejecting one of Higgs’ manuscripts. The papers went from fewer than 50 citations by the turn of the decade to around 18 000 today.

At the time the “BEH” mechanism was being dreamt up independently in Brussels and Edinburgh – and in London by Gerald Guralnik, Carl Hagen and Tom Kibble – the Standard Model of particle physics was years away. Physicists were still trying to understand the menagerie of hadrons seen in cosmic-ray and early accelerator experiments, and the nature of the weak force. The success of quantum electrodynamics (QED) in describing electromagnetism drove theorists to seek similar “gauge-invariant” quantum field theories to describe the weak and strong interactions. But the equations ran into a problem: how to make the carriers of these short-range forces massive, and keep the photon of electromagnetism massless, without spoiling the all-important gauge symmetry underpinning QED.

The 1964 Peter Higgs paper that first predicted the existence of what would come to be known as the Higgs boson. (Image: paper, APS; logo, CERN)

It took a phenomenon called spontaneous symmetry breaking, inherent in superconductivity and superfluidity, to break the impasse. In 1960, Yoichiro Nambu showed how the “BCS” theory of superconductivity developed three years earlier by John Bardeen, Leon Cooper and John R. Schrieffer could be used to create masses for elementary particles, and Jeffrey Goldstone brought elementary scalar fields to the party, picturing the vacuum of the universe as a “Mexican hat” in which the lowest-energy state is not at the most symmetrical point at the peak of the hat but in its rim. It was an abstraction too far for soon-to-be CERN Director-General Viki Weisskopf, who is said by Brout to have quipped: “Particle physicists are so desperate these days that they have to borrow from the new things coming up in many-body theory like BCS. Perhaps something will come of it.”

Four years later, Brout, Englert and Higgs added the final piece of the puzzle by showing that a mathematical block called the Goldstone theorem, which had beset initial applications of spontaneous symmetry breaking to particle physics by implying the existence of unobserved massless particles, does not apply to gauge theories such as QED. Unaware that others were on the trail, Higgs sent a short paper on the idea to Physics Letters in July 1964, where it was accepted by Jacques Prentki, the editor based at CERN. In a second paper sent one week later, Higgs demonstrated the mathematics – but it was rejected. Shocked, Higgs sent it to Physical Review Letters, and added crucial material, in particular: “it is worth noting that an essential feature of this type of theory is the prediction of incomplete multiplets of scalar and vector bosons” – a reference to the Higgs boson that was almost never published. In a further twist of fate, Higgs’ second paper was received and accepted the same day (31 August 1964) that Physical Review Letters published Brout and Englert’s similarly titled work. Today, the scalar field that switched on a fraction of a nanosecond after the Big Bang, giving the universe a non-zero “vacuum expectation value”, is generally referred to as the BEH field, while the particle representing the quantum excitation of this field is popularly known as the Higgs boson.

The 1964 Brout-Englert paper (Image: APS)

In further Nobel-calibre feats, Steven Weinberg incorporated the BEH mechanism into the electroweak theory also developed by Abdus Salam and Sheldon Glashow, which predicted the W and Z bosons, and Gerard ’t Hooft and Martinus Veltman put the unified theory on solid mathematical foundations. The discovery of neutral currents in 1973 in Gargamelle at CERN and of the charm quark at Brookhaven and SLAC in 1974 gave rise to the electroweak Standard Model. Flushing out and measuring its bosons took three major projects at CERN spanning three decades – the SPS proton–antiproton collider, LEP and the LHC. In the mid-1970s, John Ellis, Mary Gaillard and Dimitri Nanopoulos described how the Higgs boson might reveal itself, and experimentalists accepted the challenge.

The discovery of the Higgs boson at the LHC in 2012 ended one journey, but opened another fascinating adventure. Understanding this unique particle will take every last drop of LHC data, in addition to that of a “Higgs factory” that may follow. Is it elementary or composite? Is it alone, or does it have siblings? And what are the roles of the mysterious BEH field in the beginning and the fate of the universe?

“We’ve scratched the surface,” said Peter Higgs in 2019. “But we have clearly much more to discover.”

The Higgs boson holds the record (48 years) among elementary particles for the time between prediction and discovery, going from an esoteric technicality to commanding the global spotlight at the world’s most powerful collider

CMS tries out the seesaw

Wed, 04/05/2022 - 11:50
CMS tries out the seesaw

The CMS collaboration at the Large Hadron Collider (LHC) has carried out a new test on a model that was developed to explain the tiny mass of neutrinos, electrically neutral particles that change type as they travel through space.

In the Standard Model of particle physics, the particles that cannot be broken down into smaller constituents, such as quarks and electrons, gain their mass through their interactions with a fundamental field associated with the Higgs boson. The neutrinos are the exception here, however, as this Higgs mechanism cannot explain their mass. Physicists are therefore investigating alternative explanations for the mass of neutrinos.

One popular theoretical explanation is a mechanism that pairs up a known light neutrino with a hypothetical heavy neutrino. In this model, the heavier neutrino plays the part of a larger child on a seesaw, lifting the lighter neutrino to give it a small mass. But, for this seesaw model to work, the neutrinos would need to be Majorana particles, that is, their own antimatter particles.
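The seesaw analogy corresponds to the (type-I) relation m_light ≈ m_D²/M_heavy: the heavier the hypothetical Majorana partner, the lighter the known neutrino it lifts. The Python sketch below plugs in assumed illustrative scales (an electroweak-scale Dirac mass and a very heavy Majorana mass); the names and values are assumptions for illustration, not parameters from the CMS analysis:

```python
# Toy type-I seesaw: m_light ~ m_D**2 / M_heavy (all masses in eV, natural units).
# The input scales below are illustrative assumptions, not CMS results.

def seesaw_light_mass(dirac_mass_ev: float, heavy_mass_ev: float) -> float:
    """Light-neutrino mass generated by the seesaw mechanism."""
    return dirac_mass_ev**2 / heavy_mass_ev


M_DIRAC = 100e9   # ~100 GeV: an electroweak-scale Dirac mass (assumed)
M_HEAVY = 1e24    # ~10^15 GeV: a very heavy Majorana partner (assumed)

# A huge M_HEAVY pushes the light mass down to the sub-eV scale observed for neutrinos.
print(f"m_light ~ {seesaw_light_mass(M_DIRAC, M_HEAVY):.1e} eV")
```

The inverse dependence on M_HEAVY is the “seesaw”: raising the heavy side lowers the light side.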

In its recent study, the CMS team tested the seesaw model by searching for Majorana neutrinos produced through a specific process, called vector-boson fusion, in data from high-energy collisions at the LHC collected by the CMS detector between 2016 and 2018. If they took place, these collision events would result in two muons (heavier versions of the electron) that had the same electric charge, two ‘jets’ of particles that had a large total mass and were wide apart from one another, and no neutrino.

After identifying and subtracting a background of collision events that look almost the same as the sought-after events, the CMS researchers found no signs of Majorana neutrinos in the data. However, they were able to set new bounds on a parameter of the seesaw model that describes the quantum mixing between a known light neutrino and a hypothetical heavy neutrino.

The results include bounds that surpass those obtained in previous LHC searches for heavy Majorana neutrinos with masses above 650 GeV, as well as the first direct limits for heavy Majorana neutrinos with masses between 2 TeV and 25 TeV.

With the LHC set to be back in collision mode this summer, after a successful restart on 22 April, the CMS team can look forward to collecting more data and trying out the seesaw again.

____

Find out more on the CMS website.

Byline: Ana Lopes · Published: 4 May 2022, 10:42

Higgs@10 – A boson is born

Thu, 28/04/2022 - 12:12
Higgs@10 – A boson is born

On 4 July 2012, half a century’s wait came to an end as the ATLAS and CMS experiments announced the discovery of the Higgs boson. Celebrate 10 years since this extraordinary achievement by learning more about the history that led up to it, the next steps in understanding the mysterious particle, and CERN’s role in this endeavour. The “Higgs history” series of Bulletin articles will walk you through this journey, starting with an account by CERN Courier editor, Matthew Chalmers, of the theorisation of the Higgs boson in the 1960s.

_______________

It’s every theoretical physicist’s dream to conjure a new particle from mathematics and have it observed by an experiment. Few have scaled such heights, let alone had a particle named after them. In the CERN auditorium on 4 July 2012, Peter Higgs wiped a tear from his eye when the ATLAS and CMS results came in. The Higgs boson holds the record (48 years) among elementary particles for the time between prediction and discovery, going from an esoteric technicality to commanding the global spotlight at the world’s most powerful collider.

Revealing that the universe is pervaded by a stark “scalar” field responsible for generating the masses of elementary particles was never something Robert Brout and François Englert, and independently Peter Higgs, set out to do. Their short 1964 papers – one by Brout and Englert, two others by Higgs – concerned an important but niche problem of the day. “Of no obvious relevance to physics” was how an editor of Physics Letters is said to have remarked on rejecting one of Higgs’ manuscripts. The papers went from fewer than 50 citations by the turn of the decade to around 18 000 today.

At the time the “BEH” mechanism was being dreamt up independently in Brussels and Edinburgh – and in London by Gerald Guralnik, Carl Hagen and Tom Kibble – the Standard Model of particle physics was years away. Physicists were still trying to understand the menagerie of hadrons seen in cosmic-ray and early accelerator experiments, and the nature of the weak force. The success of quantum electrodynamics (QED) in describing electromagnetism drove theorists to seek similar “gauge-invariant” quantum field theories to describe the weak and strong interactions. But the equations ran into a problem: how to make the carriers of these short-range forces massive, and keep the photon of electromagnetism massless, without spoiling the all-important gauge symmetry underpinning QED.

It took a phenomenon called spontaneous symmetry breaking, inherent in superconductivity and superfluidity, to break the impasse. In 1960, Yoichiro Nambu showed how the “BCS” theory of superconductivity developed three years earlier by John Bardeen, Leon Cooper and John R. Schrieffer could be used to create masses for elementary particles, and Jeffrey Goldstone brought elementary scalar fields to the party, picturing the vacuum of the universe as a “Mexican hat” in which the lowest-energy state is not at the most symmetrical point at the peak of the hat but in its rim. It was an abstraction too far for soon-to-be CERN Director-General Viki Weisskopf, who is said by Brout to have quipped: “Particle physicists are so desperate these days that they have to borrow from the new things coming up in many-body theory like BCS. Perhaps something will come of it.”
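Goldstone’s picture can be made concrete. In modern textbook notation (not that of the 1964 papers), the “Mexican hat” potential for a complex scalar field reads

```latex
V(\phi) \;=\; \mu^{2}\,\phi^{\dagger}\phi \;+\; \lambda\,(\phi^{\dagger}\phi)^{2},
\qquad \mu^{2} < 0,\; \lambda > 0 .
```

With μ² negative, the minimum sits not at φ = 0 (the peak of the hat) but on the circle φ†φ = −μ²/2λ (its rim), so the field acquires a non-zero vacuum expectation value, v = √(−μ²/λ); in the later electroweak theory this turns out to be about 246 GeV.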

The 1964 Brout-Englert paper (Image: APS)

Four years later, Brout, Englert and Higgs added the final piece of the puzzle by showing that a mathematical roadblock called the Goldstone theorem, which had beset initial applications of spontaneous symmetry breaking to particle physics by implying the existence of unobserved massless particles, does not apply to gauge theories such as QED. Unaware that others were on the trail, Higgs sent a short paper on the idea to Physics Letters in July 1964, where it was accepted by Jacques Prentki, the editor based at CERN. In a second paper sent one week later, Higgs demonstrated the mathematics – but it was rejected. Shocked, Higgs sent it to Physical Review Letters, adding crucial material, in particular: “it is worth noting that an essential feature of this type of theory is the prediction of incomplete multiplets of scalar and vector bosons” – a reference to the Higgs boson that was almost never published. In a further twist of fate, Higgs’ second paper was received and accepted the same day (31 August 1964) that Physical Review Letters published Brout and Englert’s similarly titled work. Today, the scalar field that switched on a fraction of a nanosecond after the Big Bang, giving the universe a non-zero “vacuum expectation value”, is generally referred to as the BEH field, while the particle representing the quantum excitation of this field is popularly known as the Higgs boson.

In further Nobel-calibre feats, Steven Weinberg incorporated the BEH mechanism into the electroweak theory also developed by Abdus Salam and Sheldon Glashow, which predicted the W and Z bosons, and Gerard ‘t Hooft and Martinus Veltman put the unified theory on solid mathematical foundations. The discovery of neutral currents in 1973 in Gargamelle at CERN and of the charm quark at SLAC in 1974 gave rise to the electroweak Standard Model. Flushing out and measuring its bosons took three major projects at CERN spanning three decades – the SPS proton–antiproton collider, LEP and the LHC. In the mid-1970s, John Ellis, Mary Gaillard and Dimitri Nanopoulos described how the Higgs boson might reveal itself, and experimentalists accepted the challenge.

The discovery of the Higgs boson at the LHC in 2012 ended one journey, but opened another fascinating adventure. Understanding this unique particle will take every last drop of LHC data, in addition to that of a “Higgs factory” that may follow. Is it elementary or composite? Is it alone, or does it have siblings? And what are the roles of the mysterious BEH field in the beginning and the fate of the universe?

“We’ve scratched the surface,” said Peter Higgs in 2019. “But we have clearly much more to discover.”

Byline: Matthew Chalmers. Publication date: 28 April 2022.

CMS measures the mass of the top quark with unparalleled accuracy

Tue, 19/04/2022 - 11:12

The CMS collaboration at the Large Hadron Collider (LHC) has performed the most accurate measurement to date of the mass of the top quark – the heaviest known elementary particle. The latest CMS result determines the top-quark mass with an accuracy of about 0.22%. The substantial gain in accuracy comes from new analysis methods and from improved procedures to treat the different uncertainties in the measurement consistently and simultaneously.

The precise knowledge of the top-quark mass is of paramount importance to understand our world at the smallest scale. Knowing this heaviest elementary particle as intimately as possible is crucial because it allows testing of the internal consistency of the mathematical description of all elementary particles, called the Standard Model.

For example, if the masses of the W boson and Higgs boson are known accurately, the top-quark mass can be predicted by the Standard Model. Likewise, using the top-quark and Higgs-boson masses, the W-boson mass can be predicted. Interestingly, despite much progress, the theoretical-physics definition of mass, which has to do with the effect of quantum-physics corrections, is still tough to pin down for the top quark.

And remarkably, our knowledge of the very stability of our universe depends on our combined knowledge of the Higgs-boson and top-quark masses. At the precision of current measurements, we know only that the universe sits very close to a metastable state. If the top-quark mass were even slightly different, the universe would be less stable in the long term, and could eventually disappear in a violent event similar to the Big Bang.

To make their latest measurement of the top-quark mass, the CMS team used data from proton–proton LHC collisions collected by the CMS detector in 2016 and measured five different properties of collision events in which a pair of top quarks is produced; previous analyses measured at most three such properties. These properties depend on the top-quark mass.

Furthermore, the team performed an extremely precise calibration of the CMS data and gained an in-depth understanding of the remaining experimental and theoretical uncertainties and their interdependencies. With this innovative method, all of these uncertainties were also extracted during the mathematical fit that determines the final value of the top-quark mass, and this meant that some of the uncertainties could be estimated much more accurately. The result, 171.77±0.38 GeV, is consistent with the previous measurements and the prediction from the Standard Model.
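The idea of extracting nuisance parameters in the same fit as the mass can be illustrated with a toy chi-square fit in which two observables share a single constrained systematic nuisance parameter. Everything below is a hypothetical sketch: the observable values, uncertainties and the single nuisance parameter are invented for illustration and are not the CMS inputs or method in detail.

```python
import math

# Toy simultaneous fit: two observables sensitive to the top-quark mass m
# share one systematic nuisance parameter theta, which carries a
# unit-Gaussian constraint term. All numbers are invented.
y     = [171.5, 172.1]   # measured values (GeV)
sigma = [0.5, 0.6]       # statistical uncertainties (GeV)
s     = [0.3, 0.4]       # shift of each observable per unit of theta (GeV)

# chi2(m, theta) = sum_i ((y_i - m - s_i*theta) / sigma_i)^2 + theta^2
# The minimum of this quadratic form solves a 2x2 linear system.
w   = [1.0 / si**2 for si in sigma]
A11 = sum(w)
A12 = sum(wi * si for wi, si in zip(w, s))
A22 = sum(wi * si * si for wi, si in zip(w, s)) + 1.0
c1  = sum(wi * yi for wi, yi in zip(w, y))
c2  = sum(wi * si * yi for wi, si, yi in zip(w, s, y))

det       = A11 * A22 - A12 * A12
m_hat     = (c1 * A22 - c2 * A12) / det   # best-fit mass
theta_hat = (A11 * c2 - A12 * c1) / det   # best-fit nuisance parameter
sigma_m   = math.sqrt(A22 / det)          # profiled uncertainty on m

print(f"m = {m_hat:.2f} +/- {sigma_m:.2f} GeV, theta = {theta_hat:.2f}")
```

Because theta is determined in the same fit as the mass, its uncertainty propagates into sigma_m automatically; this is the essence of the simultaneous treatment described above, here with two observables rather than five.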

The CMS collaboration has made a significant leap forward with this new method to measure the top-quark mass. The cutting-edge statistical treatment of uncertainties and the use of more properties have vastly improved the measurement. Another big step is expected when the new approach is applied to the more extensive dataset recorded by the CMS detector in 2017 and 2018.

_____

Read more on the CMS website.

Byline: CMS collaboration. Publication date: 19 April 2022.

LHCb reveals secret of antimatter creation in cosmic collisions

Wed, 06/04/2022 - 16:02

At the Quark Matter conference today and at the recent Rencontres de Moriond conference, the LHCb collaboration presented an analysis of particle collisions at the Large Hadron Collider (LHC) that may help determine whether or not any antimatter seen by experiments in space originates from the dark matter that holds galaxies such as the Milky Way together.

Space-based experiments such as the Alpha Magnetic Spectrometer (AMS), which was assembled at CERN and is installed on the International Space Station, have measured the fraction of antiprotons, the antimatter counterparts of protons, in high-energy particles called cosmic rays. These antiprotons could be created when dark-matter particles collide with each other, but they could also be formed in other ways, such as when protons collide with atomic nuclei in the interstellar medium, which is mainly made up of hydrogen and helium.

To find out whether or not any of these antiprotons originate from dark matter, physicists therefore have to estimate how often antiprotons are produced in collisions between protons and hydrogen as well as between protons and helium. Some measurements of the former exist, and in 2017 LHCb reported the first-ever measurement of the latter; that measurement, however, covered only prompt antiproton production – that is, antiprotons produced right at the place where the collisions took place.

In their new study, the LHCb team looked also for antiprotons produced at some distance from the collision point, through the transformation, or “decay”, of particles called antihyperons into antiprotons. To make this new measurement and the previous one, the LHCb researchers, who usually use data from proton–proton collisions for their investigations, used instead data from proton–helium collisions obtained by injecting helium gas into the point where the two LHC proton beams would normally collide.

By analysing a sample of some 34 million proton–helium collisions, the LHCb researchers measured the ratio of the production rate of antiprotons from antihyperon decays to that of prompt antiprotons. They found that, at the collision energy scale of their measurement, antiprotons produced via antihyperon decays contribute much more to the total antiproton production rate than most models of antiproton production in proton–nucleus collisions predict.

“This result complements our previous measurement of prompt antiproton production, and it will improve the predictions of the models,” says LHCb spokesperson Chris Parkes. “This improvement may in turn help space-based experiments find evidence of dark matter.”

“Our technique of injecting gas into the LHCb collision point was originally conceived to measure the size of the proton beams,” says LHCb physics coordinator Niels Tuning. “It is really nice to see again that it also improves our knowledge of how often antimatter should be created in cosmic collisions between protons and atomic nuclei.”

Additional information

Video: 

https://videos.cern.ch/record/2295741

Pictures:

https://cds.cern.ch/record/2639202/files/201809-232_03.jpg?subformat=icon-1440

https://cds.cern.ch/record/2302374/files/201802-025_08.jpg?subformat=icon-1440

 

Publication date: 7 April 2022.

ATLAS strengthens its search for supersymmetry

Mon, 04/04/2022 - 18:40

Where is all the new physics? In the decade since the Higgs boson’s discovery, there have been no statistically significant hints of new particles in data from the Large Hadron Collider (LHC). Could they be sneaking past the standard searches? At the recent Rencontres de Moriond conference, the ATLAS collaboration at the LHC presented several results of novel types of searches for particles predicted by supersymmetry.

Supersymmetry, or SUSY for short, is a promising theory that gives each elementary particle a “superpartner”, thus solving several problems in the current Standard Model of particle physics and even providing a possible candidate for dark matter. ATLAS’s new searches targeted charginos and neutralinos – the heavy superpartners of force-carrying particles in the Standard Model – and sleptons – the superpartners of Standard Model matter particles called leptons. If produced at the LHC, these particles would each transform, or “decay”, into Standard Model particles and the lightest neutralino, which does not further decay and is taken to be the dark-matter candidate.

ATLAS’s newest search for charginos and sleptons studied a particle-mass region previously unexplored due to a challenging background of Standard Model processes that mimics the signals from the sought-after particles. The ATLAS researchers designed dedicated searches for each of these SUSY particle types, using all the data recorded from Run 2 of the LHC and looking at the particles’ decays into two charged leptons (electrons or muons) and “missing energy” attributed to neutralinos. They used new methods to extract the putative signals from the background, including machine-learning techniques and “data-driven” approaches.

These searches revealed no significant excess above the Standard Model background, allowing the ATLAS teams to exclude ranges of SUSY particle masses, including slepton masses up to 180 GeV. This slepton mass limit surpasses limits at low mass that were set by experiments at the LHC’s predecessor – the Large Electron–Positron (LEP) collider – and that have stood for nearly twenty years. Moreover, it rules out some of the scenarios that could explain the long-standing anomaly associated with the magnetic moment of the muon, which has recently been corroborated by the Muon g-2 experiment at Fermilab in the US.

ATLAS physicists have also released the results of a new search for chargino–neutralino pairs, following up on some previous small excesses seen in early analyses of Run 2 data. They studied collision events where the chargino and neutralino decay via W and Z bosons respectively, with the W boson decaying to “jets” of particles and the Z boson to a pair of leptons. When the mass difference between the produced neutralino and the lightest possible neutralino lies below the Z boson mass, it is harder to select the signal events and the backgrounds are more challenging to model. This is the first ATLAS result in this decay channel to target this difficult mass region. The search found no significant deviation from the Standard Model prediction and led to new bounds on SUSY particle masses.

With the LHC set to begin its third data-taking run, ATLAS physicists are looking forward to building on these exciting results to continue their SUSY searches, in particular by targeting SUSY models that are well motivated theoretically and offer solutions to existing tensions between measurements and Standard Model predictions.

_____

Read more on the ATLAS website.

Byline: ATLAS collaboration. Publication date: 6 April 2022.

MoEDAL gets a new detector

Mon, 28/03/2022 - 15:40

The MoEDAL collaboration at the Large Hadron Collider (LHC) is adding a new detector to its experiment, in time for the start of the next run of the collider this coming summer. Named the MoEDAL Apparatus for Penetrating Particles, or MAPP for short, the new detector will expand the physics scope of MoEDAL to include searches for minicharged particles and long-lived particles.

MoEDAL’s current portfolio of searches for new unknown particles includes searches for magnetic monopoles, theoretical particles with a magnetic charge, and dyons, theoretical particles with both magnetic and electric charge. These searches are conducted using two detector systems, one consisting of detectors that track particles and measure their charge, and another comprising detectors that trap particles for further investigation.

Using these tracking and trapping detector systems, the MoEDAL team has chalked up several achievements, including narrowing the regions in which to look for point-like magnetic monopoles, the first search at a particle accelerator for dyons and, more recently, the first search at a particle collider for Schwinger monopoles, which have a finite size.

The new MAPP detector, which is currently being installed in a tunnel adjacent to the LHC tunnel, consists of two main parts. One part, MAPP-mCP, will search for minicharged particles (mCP) – particles with a fractional charge as small as a thousandth of the electron’s charge – using scintillation bars. Another part of the detector, MAPP-LLP, will search for long-lived particles (LLP) employing so-called scintillator hodoscopes arranged in a ‘Russian doll’ configuration.

“MoEDAL-MAPP will allow us to explore many models of physics phenomena beyond the Standard Model of particle physics, in ways that are complementary to those of the other LHC detectors,” says MoEDAL spokesperson Jim Pinfold.

Byline: Ana Lopes. Publication date: 28 March 2022.

Mass matters when quarks cross a quark–gluon plasma

Wed, 23/03/2022 - 17:37

Unlike electrons, quarks cannot wander freely in ordinary matter. They are confined by the strong force within hadrons such as the protons and neutrons that make up atomic nuclei. However, at very high energy densities, such as those that are achieved in collisions between nuclei at the Large Hadron Collider (LHC), a different phase of matter exists in which quarks and the mediators of the strong force, gluons, are not confined within hadrons. This form of matter, called a quark–gluon plasma, is thought to have filled the universe in the first few millionths of a second after the Big Bang, before atomic nuclei formed.

At the Rencontres de Moriond conference today, the ALICE collaboration at the LHC reported an analysis of head-on collisions between lead nuclei showing that quark mass matters when quarks cross a quark–gluon plasma.

Hadrons containing charm and beauty quarks, the heavier cousins of the up and down quarks that make up protons and neutrons, offer an excellent way to study the properties of the quark–gluon plasma, such as its density. A charm quark is much heavier than a proton, and a beauty quark is as heavy as five protons. These quarks are produced in the very first instants of the collisions between nuclei, before the formation of the quark–gluon plasma that they then traverse. Therefore, they interact with the plasma’s constituents throughout its entire evolution.

Just like electrically charged particles crossing an ordinary gas can tell us about its density, through the energy they lose in the crossing, heavy quarks can be used to determine the density of the quark–gluon plasma through the energy they lose in strong interactions with the plasma’s constituents. However, before using the energy loss in the plasma to measure the plasma’s density, physicists need to validate the theoretical description of this loss.

A fundamental prediction of the theory of the strong force is that quarks that have a larger mass lose less energy than their lighter counterparts because of a mechanism known as the dead-cone effect, which prevents the radiation of gluons and thus of energy in a cone around the quark’s direction of flight.
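Quantitatively, the standard leading-order expression for the small-angle gluon-radiation probability off a quark of mass $m_Q$ and energy $E_Q$ is

```latex
\mathrm{d}P \;\propto\; \frac{\theta^{2}\,\mathrm{d}\theta^{2}}{\left(\theta^{2}+\theta_{0}^{2}\right)^{2}},
\qquad \theta_{0} = \frac{m_{Q}}{E_{Q}} ,
```

which vanishes as θ → 0: radiation is suppressed inside the “dead cone” of opening angle θ₀. The heavier the quark, the wider the cone and the smaller the radiative energy loss.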

In their new study of head-on collisions between lead nuclei, the ALICE collaboration tested this prediction using measurements of charm-quark-containing particles called D mesons. They measured D mesons produced right after the collisions from initial charm quarks, called ‘prompt’ D mesons, as well as ‘non-prompt’ D mesons produced later in the decays of B mesons, which contain the heavier beauty quarks. They presented the measurements in terms of the nuclear modification factor, which is a scaled ratio of particle production in lead–lead collisions to that in proton–proton collisions (figure below). They found that the production of non-prompt D mesons (blue markers in the figure) in lead–lead collisions is less suppressed than that of prompt D mesons (red markers).
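The nuclear modification factor is defined, schematically, as

```latex
R_{AA}(p_{\mathrm{T}}) \;=\; \frac{1}{\langle N_{\mathrm{coll}}\rangle}\,
\frac{\mathrm{d}N_{\mathrm{PbPb}}/\mathrm{d}p_{\mathrm{T}}}{\mathrm{d}N_{pp}/\mathrm{d}p_{\mathrm{T}}} ,
```

where ⟨N_coll⟩ is the average number of binary nucleon–nucleon collisions per lead–lead event. A value of 1 would mean a lead–lead collision behaves like a superposition of independent proton–proton collisions; values below 1 signal suppression by the medium.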

Comparison of the nuclear modification factor of D mesons produced from initial charm quarks (red) and from the decays of hadrons containing beauty quarks (blue), as a function of the particles’ transverse momentum. Particle-production suppression (deviation from unity) is attributed to quark interactions in the quark–gluon plasma. (Image: CERN)

These results are described well by models in which beauty quarks lose less energy than charm quarks in the quark–gluon plasma, because of their larger mass. They thus confirm the theoretical expectations of the role of quark mass in the interactions of quarks with the quark–gluon plasma. In addition, the measurements are sensitive to B mesons that have low energies. This is crucial when it comes to using beauty quarks to determine the density and other properties of the plasma.

Further measurements with the upgraded ALICE detector in the next run of the LHC, which is scheduled to start this coming summer, will help to better understand the theoretical description of the energy loss that quarks experience when they cross the quark–gluon plasma.

_____

Read more on the ALICE website.

Byline: ALICE collaboration. Publication date: 25 March 2022.

ATLAS nets top quark produced together with a photon

Wed, 23/03/2022 - 17:29

The top quark is very special. It’s the heaviest known elementary particle and therefore the one that interacts most strongly with the Higgs boson. The top quark’s interactions with other particles provide promising leads for searches for physics beyond the Standard Model. By taking accurate measurements of its properties using rare processes, physicists can explore new physics phenomena at the highest energies.

At the ongoing Rencontres de Moriond conference, the ATLAS collaboration at the Large Hadron Collider (LHC) announced the observation of one of these rare processes: the production of a single top quark in association with a photon through the electroweak interaction. With a statistical significance well above five standard deviations, the result represents the first observation of top-quark–photon production. This achievement was far from straightforward, as the search for this process was dominated by a large number of background collision events that mimic top-quark–photon production.

In their new analysis, the ATLAS researchers analysed the full LHC Run 2 data set, recorded by the detector between 2015 and 2018. They focused on collision events where the top quark decays via a W boson to an electron or a muon and a neutrino, and to a bottom quark. They further narrowed their search by seeking out a particular characteristic of top-quark–photon events: a “forward jet”, a spray of particles commonly produced in this process that travels at a small angle to the LHC’s proton beams.

To separate the top-quark–photon events from the background events, the ATLAS researchers used a neural network, which receives as input a number of variables, or features, and finds the combination of those features that most accurately classifies an event as signal-like or background-like.
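As a deliberately minimal stand-in for such a network, the sketch below trains a single-neuron logistic classifier on one invented discriminating feature. Everything here (the feature, the dataset, the training loop) is hypothetical and only illustrates the principle of turning input features into a signal-versus-background score; the actual ATLAS network and its inputs are far more elaborate.

```python
import math
import random

random.seed(42)

# Hypothetical toy data: one feature per event, with signal events
# clustering around +1 and background events around -1.
events = [(random.gauss(+1.0, 1.0), 1) for _ in range(500)] + \
         [(random.gauss(-1.0, 1.0), 0) for _ in range(500)]
random.shuffle(events)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A single "neuron": score = sigmoid(w*x + b), trained by batch
# gradient descent on the binary cross-entropy loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(200):
    grad_w = grad_b = 0.0
    for x, y in events:
        p = sigmoid(w * x + b)
        grad_w += (p - y) * x
        grad_b += (p - y)
    w -= lr * grad_w / len(events)
    b -= lr * grad_b / len(events)

# Events scoring above 0.5 are classified as signal-like.
accuracy = sum((sigmoid(w * x + b) > 0.5) == (y == 1)
               for x, y in events) / len(events)
print(f"learned w={w:.2f}, b={b:.2f}, training accuracy={accuracy:.2f}")
```

With two overlapping Gaussian classes the achievable accuracy is limited (roughly 80–85% here); the point is the mechanism of learning a decision boundary, not the number.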

The statistical significance of the ATLAS measurement of top-quark–photon production is 9.1 standard deviations – well above the 5 standard-deviation threshold required to claim observation of a process in particle physics. The expected significance, based on the Standard Model prediction, was 6.7 standard deviations.
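For reference, a significance of z standard deviations corresponds to a one-sided Gaussian tail probability, which can be computed directly (standard statistics, not ATLAS-specific code):

```python
import math

def one_sided_p_value(z):
    """Probability of the background alone fluctuating up by at least z standard deviations."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# The conventional 5-sigma discovery threshold corresponds to p of roughly 3e-7;
# 9.1 sigma is many orders of magnitude beyond it.
print(one_sided_p_value(5.0))
print(one_sided_p_value(9.1))
```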

This exciting measurement will allow physicists to look for hints of new interactions that might exist beyond the reach of the LHC. In particular, physicists can now use this process to infer information on new particles that could alter the top-quark–photon interaction. Further studies with new analysis techniques and a significantly larger data set from the upcoming Run 3 of the LHC promise an exciting road ahead.

_____

Read more on the ATLAS website.

Byline: ATLAS collaboration. Publication date: 24 March 2022.

ATLAS seeks out unusual signatures of long-lived particles

Tue, 22/03/2022 - 12:38

High-energy collisions at the Large Hadron Collider (LHC) allow researchers to clearly study heavy Standard Model particles, like the Higgs boson, that decay almost immediately at the LHC collision point. However, new long-lived particles (LLPs) could travel sizeable distances through the ATLAS detector before decaying.

Studying the decay of any particle is a complex task, but it is usually made much easier by assuming that it decayed near the LHC collision point. This leaves LLPs in a blind spot, as they could decay anywhere in the detector. To ensure no stone is left unturned, ATLAS physicists have devised a range of new strategies to look for LLPs with various possible characteristics.

The hunt for right-handed neutrinos

Neutrinos have long puzzled physicists, as they have only ever been observed to be “left-handed” (i.e. their spin and momentum are opposed), while all other particles can also be observed in “right-handed” states. One possibility is that right-handed neutrinos exist but are very heavy, and therefore harder to produce in nature. These particles – called “heavy neutral leptons” (HNLs) – could also explain why neutrinos are so light.

In a new search for HNLs, ATLAS physicists looked for leptons originating from a common point a short distance from the collision point. The HNL could have decayed to a mixture of electrons, muons and missing energy. Using the decay products, they reconstructed the possible HNL mass and were able to set limits on masses between 3 and ~15 GeV. They also reported on HNL decays to electron–muon pairs for the very first time!
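Reconstructing a parent mass from decay products rests on the invariant mass of the summed four-momenta. The helper below is a generic kinematics sketch (not the ATLAS analysis code; in the real search the neutrino’s missing energy makes the reconstruction more involved):

```python
import math

def invariant_mass(particles):
    """Invariant mass of a list of (E, px, py, pz) four-vectors, in natural units."""
    E  = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Two back-to-back massless leptons of 5 GeV each reconstruct
# to a parent of mass 10 GeV:
print(invariant_mass([(5, 0, 0, 5), (5, 0, 0, -5)]))  # 10.0
```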

Harnessing the power of machine learning

If a new, neutral LLP were to decay to quarks in the outer layers of the calorimeter, it would leave behind sprays of collimated particles called “displaced” jets. These would leave an unusual signature in the detector: the jets would have no associated particle trajectories and would be very narrow compared to their Standard Model counterparts (see event display).

ATLAS researchers have exploited the uncommon characteristics of displaced jets to search for pairs of neutral LLPs. They developed novel machine-learning methods to distinguish displaced jets from background interactions. No significant excess of events has been spotted so far.

But what if the neutral LLP decays to leptons instead of quarks? “Dark photons” are a type of LLP believed to behave this way, and would leave behind collimated sprays of leptons in the detector, called “lepton-jets”. ATLAS’s newest search for dark photons uses machine-learning techniques that exploit patterns of raw energy deposits in each layer of the detector – a first for the collaboration. Although no excess of events was seen, physicists set stringent new limits on the existence of dark photons and were able to probe dark-photon decays to electrons for the very first time!

Following the steps of charged LLPs

When searching for new particles, physicists have to look for their decay products – or do they? If a heavy charged LLP exists, it would leave abnormally large energy deposits in the ATLAS tracking detector. This is an exceptional case where physicists could actually detect a new particle directly.

Result of the ATLAS search for a heavy charged LLP. The observed data (black) agree with the Standard Model expectation (blue line), except for a small excess of events in a high-energy and high-mass region (above 1000 GeV). (Image: ATLAS collaboration/CERN)

However, predicting the Standard Model background processes in this search is very challenging. To tackle the problem, ATLAS physicists employed a sophisticated “data-driven” method using tracks with regular energy deposits for comparison. The observed data agree with the Standard Model expectation, except for a small excess of events in a high-energy and high-mass region (see figure). Although intriguing, the measurements made indicate that none of the candidate events matches the heavy-new-particle hypothesis. New searches now in the works, together with additional data, could shed more light on this excess.

Into Run 3

At the heart of these analyses is one key question: what if new particles are hiding from standard searches? ATLAS researchers have developed novel, creative ways to explore the rich diversity of possible LLP decays. The search continues, with Run 3 of the LHC promising new data and new innovations to further this exciting programme of research. 

 

Read the full version of this article here.

Byline: ATLAS collaboration. Publication date: 22 March 2022.