Last week, members of the EU’s CoE RAISE project met at CERN for their “All Hands” meeting. This innovative project is developing artificial-intelligence (AI) approaches for next-generation “exascale” supercomputers, for use across both science and industry. Use cases explored through the project include the optimisation of wind-farm layouts, design of efficient aircraft, improved sound engineering, seismic imaging with remote sensing, and more.
CoE RAISE – the European Center of Excellence in Exascale Computing “Research on AI- and Simulation-Based Engineering at Exascale” – is funded under the EU’s Horizon 2020 research and innovation programme. The project launched in 2021 and runs for three years.
The four-day meeting, which took place in CERN’s Council Chamber, was attended by 54 project members. The participants discussed progress made in their work to develop AI technologies for complex applications in Europe running on future “exascale” high-performance computing (HPC) systems. Exascale refers to the next generation of high-performance computers that can carry out over 10¹⁸ floating-point operations per second (FLOPS). Today, only the Frontier supercomputer at Oak Ridge National Laboratory in the United States has reached this level. However, with more exascale HPC systems just over the horizon, it is important to ensure that AI approaches used in science and industry are ready to capitalise fully on the enormous potential. In June, the European High Performance Computing Joint Undertaking (EuroHPC JU) announced that Forschungszentrum Jülich GmbH in Germany has been selected to host and operate Europe’s first exascale supercomputer, which is set to come online next year and will be known as JUPITER (the Joint Undertaking Pioneer for Innovative and Transformative Exascale Research).
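To put the exascale figure in perspective, here is a quick back-of-the-envelope calculation (the one-teraFLOPS laptop rate is an assumed round number, not a measured one):

```python
# Back-of-the-envelope: how long would a laptop sustaining ~1 teraFLOPS
# (an assumed round number) need to match one second of exascale computing?
EXA_FLOPS = 1e18      # exascale machine: 10**18 operations per second
LAPTOP_FLOPS = 1e12   # hypothetical laptop: 10**12 operations per second

seconds = EXA_FLOPS / LAPTOP_FLOPS   # laptop time per exascale-second
days = seconds / 86400               # convert seconds to days
print(f"{seconds:.0f} seconds, roughly {days:.1f} days")
```

In other words, under these assumptions a single second of exascale computing corresponds to well over a week of continuous laptop computation.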
CoE RAISE is developing innovative AI methods on heterogeneous HPC architectures involving multiple kinds of processor. Such architectures can offer higher performance and energy efficiency, but code must be adapted to use the different types of processors efficiently. The AI methods being developed are focused around nine key use cases and designed to scale well for running on exascale HPC systems.
CoE RAISE supports technology transfer to industry, particularly small- and medium-sized enterprises, as well as running education and training initiatives. On top of this, CoE RAISE also provides consulting and liaises with other European initiatives to maximise synergies, exploit opportunities for co-design and share knowledge. All aspects of the project’s work were discussed over the four days at CERN.
CERN is also a partner and brings one of the use cases to the project. This work focuses on improving methods for reconstructing particle-collision events at the upgraded High-Luminosity Large Hadron Collider (HL-LHC), which is set to come online in 2029. The HL-LHC will produce more particle collisions than ever before, generating exabytes of data each year and posing unprecedented computing challenges. To reconstruct particle-collision events today (with data sets of the order of terabytes or petabytes), hundreds of different algorithms run concurrently: some are traditional algorithms optimised for particular hardware configurations, while others already include AI-driven methods, such as deep neural networks (DNNs). The members of the project team at CERN are working to increase the modularity of systems and ensure that code is optimised to fully exploit heterogeneous architectures, as well as to increase the use of machine learning and other AI methods for the reconstruction of collisions and the classification of particles.
“Supercomputers are reaching the exascale and enabling the delivery of an unprecedented scale of processing resources for HPC and AI workflows,” says Maria Girone, CERN openlab CTO, who leads CERN’s contribution to the project. “The research performed in CoE RAISE will drive the co-design of HPC computing resources for future AI and HPC applications for both science and industry. This meeting enabled us to exchange and develop ideas and to bring new perspectives. It also gave researchers from other domains a unique insight into the environment and challenges facing CERN, promoting cross-fertilisation and understanding.”
Byline: Andrew Purcell | Publication date: 25 January 2023
In the future, autonomous or self-driving cars are expected to considerably reduce the number of road accident fatalities. Advancing developments on this revolutionary road, CERN and car-safety software company Zenseact have just completed a three-year project researching machine-learning models to enable self-driving cars to make better decisions faster and thus avoid collisions.
When it comes to capturing data from collisions, CERN also requires fast and efficient decision making while analysing the millions of particle collisions produced in the Large Hadron Collider (LHC) detectors. Its unique capabilities in data analysis are what brought CERN and Zenseact together to investigate how the high-energy physics organisation’s machine-learning techniques could be applied to the field of autonomous driving. Focusing on “computer vision”, which helps the car analyse and respond to its external environment, the goal of this collaboration was to make deep-learning techniques faster and more accurate.
“Deep learning has strongly reshaped computer vision in the last decade, and the accuracy of image-recognition applications is now at unprecedented levels. But the results of our research with CERN show that there’s still room for improvement when it comes to autonomous vehicles,” says Christoffer Petersson, Research Lead at Zenseact.
For processing the computer vision tasks, chips known as field-programmable gate arrays (FPGAs) were chosen as the hardware benchmark. FPGAs, which have been used at CERN for many years, are configurable integrated circuits that can execute complex decision-making algorithms in microseconds. The researchers found that significantly more functionality could be packed into the FPGA by optimising existing resources. Crucially, tasks could be performed with high accuracy and short latency, even on a processing unit with limited computational resources.
An FPGA-based readout card for the CMS tracker at CERN (Image: CERN)
“Our work together elucidated compression techniques in FPGAs that could also have a significant effect on increasing processing efficiency in the LHC data centres. With machine-learning platforms setting the stage for next-generation solutions, future development of this research area could be a major contribution to multiple other domains, beyond high-energy physics,” says Maurizio Pierini, Physicist at CERN.
The same techniques can also be used to improve algorithmic efficiency while maintaining accuracy in a wide range of domains, from energy efficiency gains in data centres to cell screening for medical applications.
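The article does not spell out which compression scheme was used, but uniform post-training weight quantisation is one common ingredient of squeezing neural networks into FPGA resources. The sketch below is a hedged illustration only, with made-up weights:

```python
import numpy as np

def quantize(weights: np.ndarray, bits: int = 8):
    """Uniformly map float weights onto signed fixed-point integer codes."""
    # Scale chosen so the largest |weight| maps to the largest integer code
    scale = np.max(np.abs(weights)) / (2 ** (bits - 1) - 1)
    codes = np.round(weights / scale).astype(np.int8)
    return codes, scale

w = np.array([0.51, -0.23, 0.08, -0.99])   # toy weights, purely illustrative
codes, scale = quantize(w)
w_approx = codes * scale                    # de-quantised approximation
max_err = np.max(np.abs(w - w_approx))      # bounded by ~scale/2 per weight
print(codes, f"max error {max_err:.4f}")
```

Storing 8-bit codes instead of 32-bit floats cuts weight memory by a factor of four, which is exactly the kind of resource saving that matters on an FPGA.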
Colliding particles not cars: CERN's machine learning could help self-driving cars (Video: CERN)
CERN’s technologies and expertise are available for scientific and commercial purposes through a variety of technology transfer opportunities. Visit cern.kt for more information.
Byline: Priyanka Dasgupta | Publication date: 25 January 2023
Much of the research and development work for the High-Luminosity LHC (HL-LHC) aims to protect the accelerator’s fragile components from the detrimental effects of high luminosity, such as radiation and increased heat. The installation of the “MKI-Cool” LHC kicker magnet (MKI), the first of a series of eight, during this year-end technical stop marks another successful milestone in this endeavour: a water-cooled toroidal ferrite cylinder will lower the projected heat load deposited onto the kicker magnet yokes, allowing them to operate reliably throughout the HL-LHC era.
The LHC’s two kicker systems, each made up of four magnets and pulse generators, are found at the intersection of the LHC ring and its two transfer lines that funnel particles from the SPS. As their name suggests, kicker magnets give each injected bunch a kick to put it on the LHC orbit. As the kick must leave the circulating LHC beam untouched, each magnetic field pulse lasts only 8 microseconds: kicker magnets must be swift and precisely timed.
These specs make kicker magnets particularly vulnerable to their harsh environment: while the highly energetic LHC beam whizzes through the magnet’s aperture, the MKI cannot be completely shielded from beam-induced heating, as shielding would interfere with its high-frequency magnetic field pulse. In addition, the high-voltage pulse required precludes any water-cooling of the MKI yoke, a serious hurdle given that the yoke loses its magnetic properties above the critical temperature of 125 °C. A dysfunctional kicker would mis-kick particles, causing quenches in the LHC’s superconducting magnets. Measures have already been taken to avert this risk in the current accelerator, but they would not suffice to guard the magnets against the four-fold increase in heat load expected at higher luminosity.
Upstream end of the MKI-Cool, showing the water-cooled ferrite cylinder. (Image: CERN)
The MKI-Cool system is an ingenious solution thought up by the Accelerator Beam Transfer group (SY-ABT) to sustainably protect the kicker magnets. A toroidal ferrite cylinder is mounted upstream of the MKI-Cool aperture to absorb a significant portion of the beam-induced heating, thus reducing the heating of the magnet’s yoke. In addition, the ferrite cylinder is cooled using water (hence the name, MKI-Cool). The MKI-Cool ferrite yoke is expected to stay below a temperature of 100 °C, even with high-luminosity beams.
The first MKI-Cool was installed at LHC Point 8 on 11 January, replacing a standard MKI. Once the LHC is restarted, the interaction of the particle beam with the MKI-Cool will test the technology’s performance. Provided this test yields positive results, the seven remaining MKI-Cools will be installed before the start of HL-LHC operations.
Publication date: 25 January 2023
Another success for the HL-LHC magnet programme: after the successful endurance test of a 4.2-metre-long niobium–tin quadrupole magnet in the United States in spring 2022, the HL-LHC quadrupole’s longer version proved its worth later in the year. “MQXFBP3”, the third full-length quadrupole prototype to be tested at SM18, reached nominal current plus an operational margin in September–October 2022, confirming the success of the niobium–tin technology for superconducting magnets.
MQXFBP3 is the third in the series of HL-LHC triplet quadrupoles that have been produced and tested at CERN in recent years. These 7.2-metre-long superconducting magnets, along with their shorter counterparts currently being produced in the United States, will focus proton beams more tightly around the ATLAS and CMS collision points to allow the tenfold increase in integrated luminosity (the number of collisions) targeted by the HL-LHC.
The first two magnets tested at CERN fell short of reaching nominal current, which prompted the Accelerator Technology department’s magnet group to improve the design and the assembly processes of its prototypes as part of the so-called “three-leg strategy”. The magnet cold mass was reworked to reduce the coupling between the welded outer stainless-steel shell and the aluminium structure of the magnet.
The MQXFBP3 magnet on its way to reaching nominal current in SM18. (Image: CERN)
This updated version – the third prototype – was able to reach nominal current (corresponding to 7 TeV in operation) plus 300 A of operational margin with only one training quench at 1.9 K. This is the first MQXF cold mass assembly, tested horizontally with a welded outer shell (as in the final configuration), to achieve this performance, which corresponds to a peak field in the coil of 11.5 T. The magnet has been subjected to two warm-up–cooldown cycles, showing no performance degradation. Even though it satisfies the acceptance criteria for operation in the HL-LHC, the magnet was limited to a current 3% below nominal at 4.5 K. The localisation and phenomenology of these quenches are very similar to those of the limiting quenches of the first and second MQXFB prototypes.
After the test, the magnet was removed from its stainless steel shell and is now being assembled with the nested dipole orbit corrector, which was provided by the Spanish institution CIEMAT. A new test in this configuration will be carried out in mid-2023. Should the test confirm its performance, MQXFBP3 will be the second Q2 cryomagnet to be installed in the IT (inner triplet) STRING.
The positive outcome of the recent test is cause for satisfaction and relief, especially as niobium–tin technologies, known to be more brittle than niobium–titanium, have come under particular scrutiny. Even so, engineers in the magnet group have more tricks up their sleeves to bring the performance of the 7.2-m-long MQXFB to the levels obtained in the short models and in the 4.2-m-long magnets manufactured in the US: MQXFB02, the stage-two magnet of the three-leg strategy, will include further technical improvements in the magnet assembly to eliminate the coil overstress during keying and bladdering operations that was observed on the first three prototypes. The magnet community is eagerly awaiting the outcome of the magnet’s powering tests, which will continue throughout the first months of 2023 at SM18 – stay tuned!
Timelapse of the insertion of the third Nb3Sn HL-LHC quadrupole prototype. (Video: CERN)
Publication date: 25 January 2023
Exactly four decades ago today, on 25 January 1983, physicists at CERN announced to the world that they had observed a new elementary particle – the W boson. Together with its electrically neutral counterpart, the Z boson, which was discovered later in the same year, the electrically charged W boson mediates the weak force, one of nature’s four fundamental forces.
Through this force, the W boson enables the nuclear fusion reaction that powers the Sun, without which life as we know it would not be possible. The W boson is also responsible for a form of radioactivity, called radioactive beta decay, that is widely used in medicine.
The W boson’s discovery was the result of an idea proposed in 1976 by Carlo Rubbia, Peter McIntyre and David Cline. The trio of physicists suggested converting CERN’s largest accelerator at the time, the Super Proton Synchrotron (SPS), from an accelerator of protons into a machine to collide protons and antiprotons (the protons’ antimatter equivalents) at a high enough energy to produce W and Z bosons. Together with Simon van der Meer’s ingenious “stochastic cooling” technique, which made it possible to reduce the size and increase the density of a proton and, later, an antiproton beam, this bold idea allowed the UA1 and UA2 experiments that were built around the converted SPS to begin hunting for the W and Z bosons in 1981.
Two years later, in a seminar on 20 January 1983 held in CERN’s Main Auditorium, Rubbia, spokesperson of the UA1 collaboration, revealed six candidate collision events for the W boson. The following afternoon, Luigi Di Lella of the UA2 collaboration presented four candidate W events and, on 25 January 1983, CERN delivered the news of the discovery of the new particle to the world.
And if that wasn’t enough to celebrate and crown the success of the converted SPS, the W boson discovery was followed a few months later by that of the Z boson, indirect evidence for which had been obtained a decade earlier at CERN’s Gargamelle bubble chamber.
The observations of the W and Z bosons further confirmed the theory of the electroweak interaction that unifies the electromagnetic and weak forces and demands the existence of the Higgs boson, which was found at the Large Hadron Collider (LHC) in 2012. Developed in the 1960s by Sheldon Glashow, Abdus Salam and Steven Weinberg and cemented in the 1970s by Gerard ‘t Hooft and Martinus Veltman, this theory is now a cornerstone of the Standard Model of particle physics.
The W and Z discoveries were recognised with the 1984 Nobel Prize in Physics for Rubbia and Van der Meer, and helped secure the decision to build CERN’s next big accelerator, the Large Electron–Positron Collider (LEP), which went on to study the W and Z bosons in detail.
Forty years on, and after many investigations at LEP and other colliders, including the LHC, the W and Z bosons continue to show their stripes and provide physicists with new ways of exploring the properties and behaviour of matter at the smallest scales.
To give a couple of examples, in 2021 the ATLAS collaboration reported the observation of the rare simultaneous production of three W bosons, and CMS obtained a high-precision measurement of the transformation of Z bosons into invisible particles. And in 2022, based on data collected by the former Tevatron accelerator, the CDF collaboration announced the most precise ever measurement of the W boson mass. However, the CDF W boson mass value is in tension with previous results, including the first at the LHC by ATLAS and LHCb, calling for new measurements with increased precision.
Research into these and other facets of the W and Z bosons will continue at the LHC and its planned upgrade, the High-Luminosity LHC.
Carlo Rubbia, spokesperson of the UA1 collaboration, revealing six candidate W boson events in a seminar on 20 January 1983. (Video: CERN)
Read the CERN Courier article remembering the discovery of the particle.
Byline: Ana Lopes | Publication date: 25 January 2023
Have you ever wondered about the purpose of the “Animal Shelter for Computer Mice” set up on the lawn in front of the CERN Data Centre? Put in place more than 10 years ago and resurrected once, it is intended as a monument to pure and perfect computer protection, as well as being a refuge for orphaned computer mice that constitute a primary risk to your computer and your computing account*.
Coupled with human curiosity ─ and as a CERN staff member or user you should be overflowing with such curiosity ─ just one innocent click of your computer mouse can put your digital life in danger. One click, to get malware installed. One click, to compromise your laptop. One click, to lose all your documents and data. One click, to hand over access to your account to some malicious person. Digital life gone. R.I.P. Game over. And game over for the Organization, too. Hence, we repeatedly warn people to “STOP ─ THINK ─ DON’T CLICK” when faced with unrelated URLs, unsolicited links or unexpected attachments (see here, here, here and there). Clicking is risky. And your computer mouse is a dangerous collaborator.
Recently, however, we’ve been informed that it’s not only computer mice that pose a threat to your digital well-being. No, trackpads do, too:
“I write to you with the greatest urgency. Dutifully following CERN’s advice (like any self-respecting scientist), I have disconnected my mouse and sent it to the animal shelter. Yet, I find myself still at risk of clicking. Sacrilegiously, my laptop has a trackpad. I feel that it is my calling in life to raise the alarm when it comes to trackpads allowing for clicking. It is an utter torment. While I have discovered this relationship, I have failed to find potential solutions. It would seem, now that mice have been banished to the shelters, trackpads are going to take over the world. Given CERN’s successful de-micing campaign, I request you – again with the highest urgency – to help us find a solution against the scourge that trackpads now represent.”
Right this sender is. Indeed, trackpads are just as dangerous as computer mice. One click, and your digital life is gone. However, digging deeper into the biology of IT, we have come to the conclusion that trackpads are not the concern of the animal shelter ... as they are not animals but plants. Computer mice move and, hence, like any other moving object, can be considered animals. Your trackpad does not move at all and is hence, based on the aforementioned definition, a plant. Unfortunately, CERN has not yet created a specific garden for trackpad plants...
For the moment, therefore, we urge any concerned scientist who fears a computer security problem caused by trackpads ─ combined with their own inherent curiosity ─ to reinforce “STOP ─ THINK ─ DON'T CLICK” by putting (at their own risk!) a big thick sticker over their trackpad or covering the trackpad with a layer of glue, Nutella or, better, Vegemite.
Whatever you do, make “STOP ─ THINK ─ DON'T CLICK” your mantra when browsing the web, opening emails or faced with links and URLs. Pause for a second in order to protect your digital life and your work for CERN. “STOP ─ THINK ─ DON'T CLICK”, with either your trackpad or your mouse. Thank you.
*Computer mice continue to be accepted at the shelter.
Do you want to learn more about computer security incidents and issues at CERN? Follow our Monthly Report. For further information, questions or help, check our website or contact us at Computer.Security@cern.ch.
Byline: Computer Security team | Publication date: 24 January 2023
When the Large Hadron Collider (LHC) is operating, it produces more than one billion proton–proton interactions every second. But exactly how many take place in the LHC experiments? Critical to every analysis of LHC data is a high-precision measurement of what is known as luminosity, that is, the total number of proton–proton interactions in a given dataset. It allows physicists to evaluate the probability of interesting proton–proton collision events occurring, as well as to predict the rates of similar-looking background processes. Isolating such events from the background processes is crucial for both searches for new phenomena and precision measurements of known Standard Model processes.
The ATLAS collaboration has recently released its most precise luminosity measurement to date. Researchers studied data taken over the course of four years (2015–2018), covering the entire Run 2 of the LHC, to assess the total luminosity delivered to the ATLAS experiment in that dataset.
What exactly did this measurement entail? When proton beams circulate in the LHC, they are arranged in “bunches” each containing more than 100 billion protons. As two bunches circulating in opposite directions cross, some of the protons interact. Determining how many interactions there are in each bunch crossing provides a measure of the luminosity. Its value depends on the number of protons per bunch, how tightly squeezed the protons are and the angle at which the bunches cross. The luminosity also depends on the number of colliding proton bunches in each beam.
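Those dependencies can be written down explicitly: for head-on Gaussian beams, the instantaneous luminosity is L = f·n_b·N₁·N₂ / (4π·σx·σy). The parameter values below are illustrative round numbers, not the actual Run-2 machine settings:

```python
import math

# Instantaneous luminosity for head-on Gaussian beams:
#   L = f_rev * n_b * N1 * N2 / (4 * pi * sigma_x * sigma_y)
f_rev = 11245                # LHC revolution frequency [Hz]
n_b = 2556                   # colliding bunch pairs (illustrative)
N1 = N2 = 1.1e11             # protons per bunch (illustrative)
sigma_x = sigma_y = 1.2e-3   # transverse beam size at the collision point [cm]

L = f_rev * n_b * N1 * N2 / (4 * math.pi * sigma_x * sigma_y)
print(f"L ≈ {L:.1e} cm^-2 s^-1")   # of order 1e34, the LHC ballpark
```

Tightening the beams (smaller σx, σy) or packing more protons per bunch raises the luminosity, which is exactly what the formula's denominator and numerator express.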
ATLAS has several detectors that are sensitive to the number of particles produced in proton–proton interactions, and the average number of measured particles is often proportional to the average number of proton–proton interactions per bunch crossing. Researchers can therefore use this average to monitor the “instantaneous” luminosity in real time during data-taking periods, and to measure the cumulative (“integrated”) luminosity over longer periods of time.
While ATLAS’s luminosity-sensitive detectors provided relative measurements of the luminosity during data taking, measurement of the absolute luminosity required a special LHC beam configuration that allows the detector signals to be calibrated. Once a year, the LHC proton beams are displaced from their normal position in order to record the particle counts in the luminosity detectors. This method is called a van der Meer beam separation scan, named after physics Nobel Prize winner Simon van der Meer, who developed the idea in the 1960s for application at CERN’s Intersecting Storage Rings. It allows researchers to estimate the size of the beam and measure how densely the protons are packed in the bunches. With that information in hand, they can calibrate the detector signals.
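A toy version of the scan illustrates the principle: stepping one beam across the other traces out a roughly Gaussian rate curve whose width is the convolved beam size. The numbers below are synthetic, purely for illustration:

```python
import numpy as np

# Toy van der Meer scan: particle-count rate vs. transverse beam separation.
sep = np.linspace(-0.6, 0.6, 25)       # beam separation steps [mm]
Sigma_true = 0.13                      # convolved beam width [mm] (made up)
rate = 1000.0 * np.exp(-sep**2 / (2 * Sigma_true**2))   # ideal scan curve

# Recover the width from the second moment of the scan curve
Sigma_est = np.sqrt(np.sum(rate * sep**2) / np.sum(rate))
print(f"estimated width ≈ {Sigma_est:.3f} mm")
```

In a real scan the curve is fitted (typically with a Gaussian plus corrections) and the widths measured in both transverse planes enter the absolute luminosity calibration.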
Working in close collaboration with ATLAS researchers, LHC experts carried out van der Meer scans under low-luminosity conditions, with an average of about 0.5 proton–proton interactions per bunch crossing and very long gaps between the bunches. For comparison, the LHC typically operates with 20–50 interactions per bunch crossing, and with bunches closer together in a “train” structure. The researchers therefore need to extrapolate the results of the van der Meer scans to the normal data-taking regime using the measurements from the luminosity-sensitive detectors.
Using this approach, and after careful evaluation of the systematic effects that can influence a luminosity measurement, ATLAS physicists determined the integrated luminosity of the full Run 2 dataset recorded by ATLAS and certified as good for physics analysis to be 140.1 ± 1.2 fb⁻¹. For comparison, 1 inverse femtobarn (fb⁻¹) corresponds to about 100 trillion proton–proton collisions. With its uncertainty of 0.83%, the result represents the most precise luminosity measurement at a hadron collider to date. It improves upon previous ATLAS measurements by a factor of 2 and is comparable with results achieved at the ISR experiments (0.9%).
Byline: ATLAS collaboration | Publication date: 24 January 2023
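The "100 trillion" figure can be checked with one line of arithmetic: the number of collisions is the total proton–proton cross-section times the integrated luminosity. Taking a round value of 100 mb for the cross-section (an assumption for illustration):

```python
# N = sigma_total * integrated luminosity, expressed in matching units.
sigma_barn = 100e-3        # ~100 millibarn total pp cross-section, in barn
barn_to_fb = 1e15          # 1 barn = 1e15 femtobarn
lumi_inv_fb = 1.0          # one inverse femtobarn

collisions = sigma_barn * barn_to_fb * lumi_inv_fb   # sigma[fb] * L[fb^-1]
print(f"{collisions:.0e} collisions per fb^-1")      # 1e+14 = 100 trillion
```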
The International Day of Women and Girls in Science was adopted by the United Nations General Assembly in order to promote full and equal access and participation for women and girls in science. 11 February is an opportunity to celebrate the essential role that women and girls play in science and technology.
For the seventh year running, CERN is getting involved by organising local activities for all ages.
Presentations in local schools
A CERN scientist presents the Organization in an elementary school.
From 30 January to 3 February 2023, around a hundred volunteers – all female scientists and engineers from CERN, Scienscope (UNIGE), the École Polytechnique Fédérale de Lausanne (EPFL) and the Annecy Particle Physics Laboratory (LAPP) – will be visiting some 240 schools in the canton of Geneva, the Nyon area, the Pays de Gex and the Annecy conurbation to talk to the pupils about their professions.
They will talk about their career history and the projects and experiments in which they are involved, and in some cases give a short demonstration. The aim is to change how young people in our region view scientific, technical and technological professions and to show them that careers in science, technology, engineering and maths are just as accessible to girls as to boys. And, who knows, the presentations might even help some to discover their vocation!
Show “La Forza Nascosta – Scienziate nella Fisica e nella Storia” (suitable for all ages)
CERN will host the show “La Forza Nascosta – Scienziate nella Fisica e nella Storia” (The Hidden Force – Women Scientists in Physics and History) at 8.00 p.m. on Wednesday, 8 February in the Globe of Science and Innovation. This musical theatre production tells the story of physics in the twentieth century through the eyes of four renowned women scientists: astronomer Vera Cooper Rubin, nuclear physicist Marietta Blau, and particle physicists Chien-Shiung Wu and Milla Baldo Ceolin.
The show, in Italian with English subtitles, was conceived, written and promoted by a group of women physicists from the Italian National Institute for Nuclear Physics and the University of Turin’s Physics Department.
For more information and to register: http://voisins.cern/en/events
Interactive theatre-forum “Coffee Machine” event (for the CERN community)
The CERN Diversity & Inclusion Programme, in collaboration with the diversity offices of ALICE, CMS and LHCb, will host the interactive theatre-forum “Coffee Machine” event, open to all CERN personnel.
The event aims to raise awareness of how sexist behaviour can limit the full participation of women in the workplace. Through an interactive theatre piece performed by a Geneva-based group, we will observe subtle forms of such behaviour and learn how to intervene in a timely and effective manner. This creative activity reminds us how we can all contribute towards an inclusive and conducive scientific research environment.
The event will take place on 9 February from 2.00 to 4.00 p.m. in the Globe. Places are limited, so for more information and to register please click here.
Publication date: 24 January 2023
At CERN, ionising radiation is produced by the collision of particle beams with matter. CERN’s unique facilities require innovative approaches to minimising the exposure of workers, the public and the environment, making CERN one of the recognised leaders in this field. CERN’s radiation protection (RP) is in line with best practice in Europe, and staying at the forefront in this area is a priority for the Organization.
In addition to the many RP controls and measures in place, CERN is equipped with a robust radiation and environmental monitoring system (REMS) to provide it with all means necessary to protect the public, the people working on site and the environment – both when the accelerators are being exploited to their full physics potential and during shutdowns.
CERN’s legacy monitoring system – the Area Controller (ARCON) used since the 1980s – was replaced in 2021, with the completion of a project that started in 2014: CERN Radiation Monitoring Electronics (CROME). This brand-new generation of radiation monitoring systems was developed fully in house by a small team in the HSE Radiation Protection group (HSE-RP) with specific experience and know-how in measuring mixed and pulsed radiation fields present during accelerator operation. Its mass production was carried out almost entirely at CERN, thanks to interdepartmental collaboration with TE, BE, EN, IT and EP, and further supported by industrial partners from ten countries.
In the spirit of collaboration, knowledge exchange and technology transfer, the ambition to share the benefits of this REMS solution has been alive for some time, as Daniel Perrin, Leader of the Instrumentation and Logistics section in HSE-RP, explains: “While many workshops, conferences and working groups exist in various fields such as electronics, software, radiation protection and dosimetry, we had nothing of this kind for REMS. With CROME’s achievement, we felt it was high time to create such a forum.”
This is how the first PulsRad workshop came about, to celebrate the success of the CROME project and to start building a community around REMS with a particular focus on pulsed radiation monitoring. The workshop took place from 5 to 7 December 2022 in the Globe of Science and Innovation, organised by the HSE-RP group at CERN and the European Spallation Source ERIC, with support from the ITER Organization and Fusion for Energy.
Hamza Boukabache, Electronics Engineer and CROME Project Manager in HSE-RP-IL, who led the organisation of this hybrid workshop, highlighted its pioneering nature: “In industry, knowledge about radiation measurements is often kept confidential within companies. As a scientific organisation, we have the possibility to share our knowledge and experience of REMS with other research facilities. The idea is to create a space for discussion and exchange for engineers and scientists developing, deploying and operating REMS, to identify common problems and create synergies between different organisations in order to devise solutions as well as provide an entry point for newcomers to REMS.”
And the objective was met. Some 75 participants joined physically and remotely from CERN and other large scientific facilities across the globe, including ITER, SLAC National Accelerator Laboratory, DESY, GSI Helmholtz Centre for Heavy Ion Research, CHUV – Lausanne university hospital, Paul Scherrer Institute and KEK, to name a few.
Alasdair Day, a former member of CERN’s HSE-RP-IL section and now Senior Engineer for Radiation Monitoring at the European Spallation Source, was a driving force in the inception and organisation of PulsRad22: “This workshop is a beginning, and I’m really excited to see where it goes in the future. For me, being able to give assistance to specialists during a breakout session, whilst being given insights from other experts during different discussions, clearly demonstrated the worth of this event. It was an opportunity for participants to share experiences while taking something back to their facilities and institutes. This collegial fraternity can help not only us as individuals, but also our respective organisations and, from that, science as a whole.”
This community has a bright future and the next event will take place in 2024, probably at ITER.
The reuse of equipment is inherent to the technological success that has always made and continues to make CERN what it is. The Proton Synchrotron is certainly the prime example of this – commissioned in 1959 and upgraded multiple times since, it is still tirelessly injecting protons into the accelerator chain. Another facility that fits perfectly into this paradigm is Building 180, CERN’s second-biggest building in terms of surface area (13 500 m²) and home to the Large Magnet Facility (LMF), which has regularly been relocated over the years to adapt to the Organization’s evolving needs. The building got another makeover during the renovation work that was completed in 2022, readying the world’s biggest magnet factory to take on the challenges of the HL-LHC era and beyond.
Originally designed to house fixed-target experiments (most notably the Gargamelle bubble chamber), the facility was converted in the 1980s into the factory for magnets and detector components that we know today. It is now home to the LMF, where the LHC dipoles were assembled, and to several ATLAS workshops. Much of the equipment that criss-crosses the LMF Hall today – including its powerful presses – actually dates back to the facility’s origins in the 1960s. These machines have survived through the generations thanks to the efforts made to breathe new life into them:
“A group of young engineers with the freedom to exercise their creativity has been able to find new purposes for equipment that has only rarely been used since the decline of the fixed-target experiments. This new lease of life is all the more welcome because the equipment in question is rare and no longer manufactured. Some of the presses are now used to make components for the future HL-LHC – and they work very well!” says a delighted Rosario Principe, from the Accelerator Technology department.
A similar approach was taken to the year-long renovation of Building 180, which was completed in 2022. While the building’s envelope was being redone to improve its long-term sustainability, steps were taken to ensure that operations could continue throughout the renovation process. The main aim of the works, which also encompassed the adjacent Building 183, was to remove asbestos from the vast hall, improve its insulation and waterproofing, and modernise the façade, windows and roof. This was no mean feat given the sheer size of the building and the need for work to continue in the LMF and the ATLAS workshops, where the New Small Wheels, which went on to be installed in the detector during the renovation period, were being assembled.
“Initially, a lot of uncertainty surrounded the project, which we approached with a certain degree of caution. Carrying out such large-scale works on a building that is so vital for CERN was a major challenge, but we pulled it off,” says David Rodriguez, the construction project leader. The key to the success of the works was effective coordination between CERN, the building’s users and the six contractors making up the consortium carrying out the works.
“All the parties involved agreed that the bulk of the work would be performed from outside the building, via scaffolding erected against the façade. This allowed us to minimise our impact on the activities inside and to stay on good terms with the building’s users. We would particularly like to thank the building’s TSO, Rodrigue Faes, for facilitating communication among the teams,” adds Milton Morais, the construction manager.
The renovated building is not only safer and more elegant, but the new insulation significantly improves the working conditions of the users, who enjoy a warmer working environment while consuming less energy overall. This improvement in the building’s energy performance earned CERN a subsidy from Geneva’s cantonal energy office.
Following its metamorphosis, Building 180 is all set to remain at the heart of CERN’s activities for years, if not decades, to come.
Thomas Hortala, 18 January 2023
Today, CERN celebrates the completion of the civil-engineering work for the High-Luminosity Large Hadron Collider (HL-LHC), the major upgrade of its flagship collider, the LHC.
Approved in June 2016 and due to start operating in 2029, the HL-LHC will considerably improve the performance of the LHC by increasing the number of particle collisions and thus boosting the potential for discoveries. The completion of the civil-engineering work marks the start of the transition towards the HL-LHC era; the new components for the collider will be installed in the caverns and galleries that are now ready.
The HL-LHC is CERN’s main scientific goal of the decade, recognised as one of the highest priorities for the field in the 2020 update of the European Strategy for Particle Physics. This major upgrade builds on the success of the LHC since it started operating in 2010. While the LHC is able to produce up to 1 billion proton–proton collisions per second, the HL-LHC will increase this number, known as “luminosity”, by a factor of between five and seven, allowing about ten times more data to be accumulated between 2029 and 2041, the period during which it will be operating.
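In round numbers, the gain can be made concrete. The figures below are the widely quoted design values for the two machines, given here purely for illustration:

```latex
% Instantaneous luminosity: LHC design value vs. HL-LHC levelled value
\mathcal{L}_{\text{LHC}} \approx 1\times10^{34}\,\text{cm}^{-2}\,\text{s}^{-1},
\qquad
\mathcal{L}_{\text{HL-LHC}} \approx 5\times10^{34}\,\text{cm}^{-2}\,\text{s}^{-1}
\quad (\text{up to } 7.5\times10^{34})

% Integrated luminosity (data accumulated): roughly a tenfold increase
\int \mathcal{L}\,dt \;:\quad
\sim 300\,\text{fb}^{-1}\ \text{(LHC programme)}
\;\longrightarrow\;
\sim 3000\,\text{fb}^{-1}\ \text{(HL-LHC target)}
```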
To achieve this increase in luminosity, several innovative and challenging key technologies are being developed. These include new superconducting quadrupole magnets (based on niobium–tin instead of niobium–titanium) that will better focus the beam, and compact crab cavities that tilt the beams at the collision points, thus maximising the overlap of protons. Other innovations include high-temperature superconducting links, new technologies for beam vacuum (prolonging the lifetime of the magnets) and beam collimation (protecting the magnets from quenching), as well as very precise high-current power converters.
Most of these HL-LHC components will be integrated at Point 1 (Meyrin, Switzerland) and Point 5 (Cessy, France) of the LHC ring, where the high-luminosity detectors ATLAS and CMS are located.
“The civil-engineering work started in June 2018 and, despite the difficult global context, was successfully completed at the end of 2022. The technological developments are well advanced, so we really are at the start of the transition towards the HL-LHC era, one that will push the boundaries of technology and knowledge even further. It will allow physicists to study known mechanisms in greater detail, such as the Higgs boson, and observe rare new phenomena that might reveal themselves,” says Oliver Brüning, HL-LHC project leader.
The HL-LHC is an international endeavour involving 43 institutions in 19 countries, including in CERN’s Member and Associate Member States, as well as in the United States, Canada, Japan and China.
Only three years have passed since the last surveys for staff and fellows were launched, yet the world looks very different than it did in 2019. Our community, like society at large, is recovering from the effects of a long pandemic and is now dealing with the consequences of the military invasion of Ukraine and difficult economic conditions. On a brighter note, the transition from the second long shutdown (LS2) to Run 3 has been very successful, shifting resources across the Organization to focus on the operation of our flagship collider and the next large upgrade of our infrastructure for the High-Luminosity LHC in 2026–2028. These events have shaken up the way we work, warranting the launch of a new survey at CERN.
Surveys are just one of the tools the Organization’s Management uses to gather input from its community. By making your voice heard, you help identify what matters to you in your work environment, which, in turn, allows the Management to better understand potential areas for improvement and what motivates and engages the personnel.
This year’s survey comprises a condensed questionnaire, which should not take more than 15 minutes to complete, and aims to paint a comprehensive picture of your experience at CERN by focusing on four key themes: “my work”, “my management”, “the Organization” and “the future of CERN”. Staff members and fellows will be invited to share their feedback on these four themes, highlighting the Organization’s strengths as well as the areas where improvement is still needed.
The importance of this exercise cannot be overstated: the findings will be used to shape improvements and prioritise actions for the coming years. With that in mind, be on the lookout for the launch of the survey, which should reach your inbox by the end of the month and will also feature in the next issue of the CERN Bulletin.
We thank you in advance for your collaboration and look forward to your input.
12 January 2023
With more and more random software installed on our laptops, tablets, smartphones and other devices, more and more security risks creep in. Every piece of software is a security risk. Every piece of software comes naturally in an imperfect state, with human-implemented weaknesses and vulnerabilities to be discovered. Fortunately, many software vendors (but far from all) have put systems in place to fix discovered vulnerabilities and weaknesses as soon as possible. And with “auto-update” enabled, your device might just install that new version and keep you safe. Unfortunately, not every auto-update is so auto.
What is meant by “auto” can vary widely. Usually, it is expected that, with “auto-update”, new versions are discreetly installed in the background. In other cases, the update process might be verbose with pop-ups and message windows, or even require a reboot. But some “auto-updates” don’t even self-launch. They’re actually not that “auto” at all, but require you to take action – to take responsibility and get it going by scheduling and launching the update process yourself. And this is where the process fails, lazy as we are. And so, lazily, we put the security of our devices at risk.
We shouldn’t. Our digital life depends heavily on the security of our devices (see our Bulletin article on apartments). Just think of the mess you’d be in if a malicious, evil attacker got access to your device(s) – to your hard disk, documents, photos and files. To your camera and microphone. To your keyboard and the keys you type. Malicious access obtained. Data gone. Passwords gone. Privacy gone. Confidentiality gone. Your digital life gone. And with it your work, and the security of CERN. Terminated. Game over. Bye-bye.
For the sake of protecting our digital life – for the sake of protecting our Organization, too! – we should secure our devices as thoroughly as we can. We should ensure that our entire installed software stack is always up-to-date. We should ensure that “auto-update” really means “auto” and is configured to be “auto”. We should allow software demanding to be updated to launch its update process as soon as possible, whether immediately or overnight. And we should refrain from postponing updates forever. Ignoring them. Suppressing them. Because a missing update implies an unfixed weakness and vulnerability. Because a missing update poses a risk – to your digital life and to the Organization. Intervening manually to make “auto-update” really “auto” would reduce that risk. Thank you for securing your digital life. And CERN.
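Under the hood, the first decision any updater makes is whether the installed version is stale. A minimal sketch of that check is shown below; the version strings are hypothetical, and real updaters additionally verify signatures and handle far richer versioning schemes:

```python
def parse_version(version: str) -> tuple:
    # "1.4.2" -> (1, 4, 2); assumes a simple dotted numeric scheme
    return tuple(int(part) for part in version.split("."))

def needs_update(installed: str, latest: str) -> bool:
    # Tuples compare element by element, so (1, 4, 2) < (1, 5, 0)
    return parse_version(installed) < parse_version(latest)

# Hypothetical version numbers, for illustration only.
assert needs_update("1.4.2", "1.5.0")        # older -> update needed
assert not needs_update("2.0.0", "2.0.0")    # already up to date
assert not needs_update("2.1.0", "2.0.9")    # ahead of the published version
```

The point of the sketch: the check itself is trivial, so when an “auto-update” stalls, it is almost always the launching and scheduling around it that failed, not the comparison.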
Do you want to learn more about computer security incidents and issues at CERN? Follow our Monthly Report. For further information, questions or help, check our website or contact us at Computer.Security@cern.ch.
Computer Security team, 10 January 2023
Staff members who marked 25 years of service to CERN in 2022 were invited by the Director-General to the traditional ceremony in their honour held on 22 November 2022.
The photos from the ceremony and the list of the 67 staff members concerned can be viewed in this album (restricted access).
We thank them all warmly for their commitment and wish them continued success at CERN!
HR department, 9 January 2023
Marco van Leeuwen, senior scientist at Nikhef (Netherlands), has taken over from Luciano Musa as ALICE spokesperson as of early January 2023. He will lead the collaboration for the coming three years. Elected by the ALICE Collaboration Board, Marco comes to the position after serving as the upgrade coordinator for the last three years, and as ALICE physics coordinator prior to that. The new management team includes deputy spokespersons Kai Schweda, senior scientist at the GSI Helmholtz Centre in Darmstadt, Germany, and Bedangadas Mohanty, professor of physics at the National Institute of Science Education and Research in Bhubaneswar, India.
The new team is looking forward to collecting several large data samples with the upgraded ALICE detector during Run 3, including the first heavy-ion data taking of Run 3 later this year, as well as preparing for the ITS 3 and FoCal upgrades and the ALICE 3 programme.
Find out more about the new management on the ALICE website.
9 January 2023
As CERN celebrated ten years since the Higgs boson was discovered, experiments produced a wealth of physics results: from the discovery of new exotic particles to surprising antimatter behaviour and much more.
CERN strengthened international collaborations, bringing together experts to discuss quantum technologies and future technology for health, in a year when Brazil signed an agreement to become an Associate Member State. Additional knowledge-sharing initiatives included a world first in cancer radiotherapy, future clean aviation and even the first CERN-driven satellite being launched into space.
Watch this video and enjoy a visual journey through key moments of 2022!
21 December 2022
Ever since the open access (OA) publication of peer-reviewed primary research articles from CERN authors was made a policy requirement in 2014, CERN has made great strides forward in opening its research to anyone around the world. This has been achieved thanks to a variety of mechanisms implemented by the CERN Scientific Information Service (SIS), ranging from a series of Read & Publish agreements signed with major publishers to CERN’s participation in the SCOAP3 consortium, which has arranged for automatic OA to research in high-energy physics (HEP).
Books (including monographs and textbooks) have often been left out of such agreements and schemes. However, more and more monographs are now being published OA, thanks in part to historical and recent initiatives supported by CERN. The latest of these initiatives, SCOAP3 for books, has made dozens of books available in OA since its inception in 2022.
CERN’s commitment to OA for books is nothing new: CERN authors have long benefitted from the Organization’s support to help them make their monographs and reports freely accessible to anyone. As a result, ever since the OA publication of the first Yellow Reports in 1955, many monographs by CERN authors have followed suit. CERN’s efforts in this direction have recently been complemented by the Organization’s participation in MIT Press’s Direct to Open programme, through which libraries around the world shift from buying monographs from the MIT Press to funding them for everyone.
On top of all that, SCOAP3 for books looks set to bring about an enduring change in the publishing landscape for books in HEP and related disciplines. The initiative, which represents an expansion of the regular activities of the CERN-coordinated SCOAP3, has so far made more than 60 academic books (including monographs and textbooks) available open access. Voluntary contributions from hundreds of SCOAP3 member institutions fund the programme, opening education and research in HEP to the world.
Books published in OA thanks to the SCOAP3 for books initiative can be accessed on the publishers’ websites and through a dedicated collection on the OAPEN Library.
Enjoy your reading!
For any questions, please contact email@example.com.
CERN Scientific Information Service
Today the international LHCb collaboration at the Large Hadron Collider (LHC) presented new measurements of rare particle transformations, or decays, that provide one of the highest-precision tests yet of a key property of the Standard Model of particle physics, known as lepton flavour universality. Previous studies of these decays had hinted at intriguing tensions with the theoretical predictions, potentially due to the effects of new particles or forces. The results of the improved and wider-reaching analysis, based on the full LHC dataset collected by the experiment during Run 1 and Run 2 and presented this morning at a seminar at CERN, are in line with the Standard Model expectation.
A central mystery of particle physics is why the 12 elementary quarks and leptons are arranged in pairs across three generations that are identical in all but mass, with ordinary matter comprising particles from the first, lightest generation. Lepton flavour universality states that the fundamental forces are blind to the generation to which a lepton belongs. In recent years, however, an accumulation of results from LHCb and experiments in Japan and the US have suggested that this might not be the case, generating cautious excitement among physicists that a more fundamental theory – perhaps one that sheds light on the Standard Model’s mysterious flavour structure – might reveal itself at the LHC.
Interest in the “flavour anomalies” peaked in March 2021, when LHCb presented new results comparing the rates at which certain B mesons, composite particles that contain beauty quarks, decay into muons and electrons. According to the theory, decays involving muons and electrons should occur at the same rate, once differences in the leptons’ masses are accounted for. But the LHCb results hinted that B mesons decay into muons at a lower rate than predicted, as indicated by the results’ statistical significance of 3.1 standard deviations from the Standard Model prediction.
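The comparison at stake is usually expressed as a ratio of branching fractions, sketched here for the decay of a charged B meson into a kaon and a lepton pair. In the Standard Model, once lepton-mass effects are accounted for, the ratio is predicted to be very close to unity:

```latex
R_K \;=\;
\frac{\mathcal{B}\left(B^{+}\to K^{+}\mu^{+}\mu^{-}\right)}
     {\mathcal{B}\left(B^{+}\to K^{+}e^{+}e^{-}\right)}
\;\overset{\text{SM}}{\approx}\; 1
```

A measured value significantly below 1, as the 2021 result hinted, would mean B mesons decay to muons less often than to electrons, in violation of lepton flavour universality.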
The new LHCb analysis, which has been ongoing for the past five years, is more comprehensive. It considers two different B-meson decay modes simultaneously for the first time and provides better control of the background processes that can mimic the decays of B mesons to electrons. In addition, the two decay modes are measured in two different mass regions, thus yielding four independent comparisons of the decays. The results, which supersede previous comparisons, are in excellent agreement with the principle of lepton flavour universality.
“Measurements of the ratios of rare B-meson decays to electrons and muons have generated much interest in recent years because they are theoretically ‘clean’ and show consistency with a pattern of anomalies seen in other flavour processes,” explains LHCb spokesperson Chris Parkes of the University of Manchester and CERN. “The results shown today are the product of a comprehensive study of the two main modes using our full data sample and applying new, more robust techniques. These results are compatible with the expectation of our theory.”
New datasets will allow LHCb – one of the four large experiments at the LHC at CERN – to investigate lepton flavour universality further, in addition to conducting a wider research programme that includes studies of new hadrons, including the search for exotic tetraquarks and pentaquarks and investigation of the differences between matter and antimatter. An upgraded version of the experiment now in operation for LHC Run 3 will collect larger datasets that will allow even higher-precision tests of rare particle decays.
“Earlier LHCb indications of anomalies concerning lepton flavour universality triggered excitement,” says theoretical physicist Michelangelo Mangano of CERN. “That such anomalies could potentially have been real shows just how much remains unknown, since theoretical interpretations exposed a myriad of unanticipated possible phenomena. The latest LHCb findings take nothing away from our mission to push the boundary of our knowledge further, and the search for anomalies, guided by experimental hints, goes on!”
After over three years of upgrade and maintenance work, the Large Hadron Collider began its third period of operation (Run 3) in July 2022. Since then, the world’s most powerful particle accelerator has been colliding protons at a record-breaking energy of 13.6 TeV. The ATLAS collaboration has just released its first measurements of these record collisions, studying data collected in the first half of August 2022.
The researchers measured the rates of two well-known processes: the production of top-quark pairs and the production of a Z boson, which proceed through strong and electroweak interactions, respectively. The ratio of their cross sections is sensitive to the inner structure of the proton, and its measurement sets constraints on the relative probabilities that reactions are initiated by quarks and gluons.
These early measurements also validate the functionality of the ATLAS detector and its reconstruction software, which underwent many improvements in preparation for Run 3.
Physicists focused on Z-boson decays to electron and muon pairs, and on top-quark decays to a W boson and a jet – collimated sprays of particles – originating from a bottom quark. The W boson subsequently decays into one electron or muon and an invisible neutrino. As the analysis uses very early Run 3 data, physicists relied on preliminary calibrations of the leptons, jets and luminosity. These were derived promptly after the first data became available.
ATLAS measured a top-quark pair to Z boson production ratio that is consistent with the Standard Model prediction within the current experimental uncertainty of 4.7%.
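One reason a cross-section ratio is attractive with early data is that the integrated luminosity, whose preliminary calibration is still uncertain, cancels out. The back-of-the-envelope sketch below is not the ATLAS analysis code, and the event counts and efficiencies are invented purely for illustration:

```python
import math

def cross_section(n_events, efficiency, luminosity_fb):
    # sigma = N / (efficiency * integrated luminosity); illustrative only
    return n_events / (efficiency * luminosity_fb)

# Hypothetical candidate counts and selection efficiencies.
N_TT, EFF_TT = 100_000, 0.05    # top-quark-pair candidates
N_Z,  EFF_Z  = 2_000_000, 0.30  # Z-boson candidates

# The same (possibly poorly known) luminosity enters both cross sections...
ratio_a = cross_section(N_TT, EFF_TT, 1.2) / cross_section(N_Z, EFF_Z, 1.2)
# ...so it cancels: a different assumed luminosity gives the same ratio.
ratio_b = cross_section(N_TT, EFF_TT, 1.0) / cross_section(N_Z, EFF_Z, 1.0)
assert math.isclose(ratio_a, ratio_b)

# Statistical part of the ratio uncertainty: relative errors in quadrature.
rel_stat = math.sqrt(1 / N_TT + 1 / N_Z)
print(f"ratio = {ratio_a:.3f} +/- {100 * rel_stat:.2f}% (stat.)")
# -> ratio = 0.300 +/- 0.32% (stat.)
```

In the real measurement, of course, systematic uncertainties on efficiencies and backgrounds dominate, which is why the quoted precision is 4.7% rather than the purely statistical figure.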
The calibration and corresponding uncertainties will be improved as more data is processed. Future updates of the calibration will allow researchers to measure the cross sections with greater precision.
To validate their results, physicists performed a series of cross-checks. These included measuring the ratio of the cross sections each time the LHC was injected with a new fill of protons for a data-taking run.
More analyses using the Run 3 data will follow, exploiting the unprecedented energies and the increased LHC data set.
Read more on the ATLAS website.
ATLAS collaboration, 16 December 2022
The CMS experiment is one of the largest international scientific collaborations in history, with a broad programme of activities at the forefront of particle physics research. As of 5 December 2022, all of the proton-proton data collected by CMS during Run 1 of the Large Hadron Collider (LHC) is now available through the CERN Open Data Portal. This completes a process that started in 2014, when CMS made the very first open data release in experimental particle physics.
Completing the delivery of its Run 1 data within 10 years reaffirms the CMS collaboration’s commitment to an open data policy. This policy embodies values laid down in the CERN Convention, which states that all research undertaken at the Laboratory must be open and available to everyone.
The newly released CMS data consists of 42 collision datasets, representing a total of 491 terabytes, taken in early and late 2012 towards the end of LHC Run 1. This data includes some of the original findings from CMS that were used to confirm the existence of the Higgs boson, which earned François Englert and Peter Higgs the 2013 Nobel Prize in Physics.
Included in the release are examples of code used to extract physics. This software has been successfully used to demonstrate the intricacies of experimental particle data taking in the CMS Open Data workshops held over the last three years. In addition, the CMS Open Data guide covers details of how physics objects can be accessed using this software, giving users the possibility to expand on this sample code for studies of their own interest.
Adaptable software samples are one of the most efficient ways of passing on the knowledge needed for research on the CMS data. “The software included in this release helps us preserve the huge efforts of the CMS Run 1 data analysts,” says Julie Hogan, one of the key contributors to the CMS Open Data workshops.
“The code samples are essential ingredients for any serious effort to use this data for research,” adds Edgar Carrera, the lead organiser of the latest workshops. “We therefore do our best to allow users of the data to follow the original CMS procedures as closely as possible.”
The preparations for the next CMS data releases are under way. The collaboration looks forward to providing additional heavy-ion open data from Run 1 and to proceeding with further Run 2 releases.
Find out more about the CMS Open Data workshops in this video.
CMS collaboration, 14 December 2022