ML4FCC

Machine Learning for the Future Circular Collider Design

Started
October 18, 2022
Status
In Progress

Abstract

In June 2020, the Council of the European Organization for Nuclear Research (CERN) updated the European strategy for particle physics to establish the electron-positron Higgs factory as the highest-priority facility after the Large Hadron Collider (LHC) era [1]. The strategy emphasizes the importance of ramping up the research and development for advanced accelerator and computing technologies to prepare for future collider facilities, such as the Future Circular Collider (FCC) [2].

The Swiss Accelerator Research and Technology Institute (CHART, [3,4]) appointed the Laboratory for Particle Accelerator Physics at EPFL [5] to spearhead the efforts for developing the framework and the particle beam dynamics tools required to design and optimize the FCC. While the FCC will undoubtedly lead to significant breakthroughs in high-energy particle physics, the design and operation of the machine as a precision instrument need careful consideration and present an exciting Big Data and Machine Learning (ML) opportunity.

People

Scientists


CERN:

  • Dr. Tatiana Pieloni


Introduction

Accelerator performance is characterized by the size of the region of phase space in which the beam particles feature bounded, i.e. stable, dynamics under long-term tracking (~10 h, or 10^8 revolutions). The dynamics is defined by the static and variable non-linear electric and magnetic field elements that comprise the machine, also known as the lattice. Knowledge of the boundary between the stable and the chaotic regimes, called the Dynamic Aperture (DA), and its maximization are vital during the design phase and eventually for successful machine operation, since particles located in the unstable regime will be lost from the beam and reduce its lifetime. Such losses limit the collider’s luminosity reach and hence have a critical impact on its physics performance. The best way to assess the DA at present is by means of particle tracking simulations [6]. Typically, simulations can cover only up to 10^6 revolutions around the ring (~1 minute in real time). While the physics of the involved processes is well understood, the mathematical formulation is a non-linear dynamics problem that is highly sensitive to the initial conditions.
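The idea of estimating the DA by tracking can be illustrated with a minimal sketch. The 2D Hénon map below (a linear rotation plus a quadratic, sextupole-like kick) is a standard toy model for single-particle dynamics in a circular machine; it is not the FCC lattice, and the tune, turn count, and loss threshold chosen here are illustrative assumptions only:

```python
import numpy as np

def henon_map(x, p, omega):
    """One turn of the 2D Henon map: a linear rotation of angle omega
    plus a quadratic kick, a classic toy model of non-linear
    single-particle dynamics in a circular accelerator."""
    c, s = np.cos(omega), np.sin(omega)
    pk = p + x**2  # quadratic (sextupole-like) kick
    return c * x + s * pk, -s * x + c * pk

def survives(x0, n_turns=1000, omega=2 * np.pi * 0.205, bound=10.0):
    """Track a particle for n_turns; report whether it stays bounded."""
    x, p = x0, 0.0
    for _ in range(n_turns):
        x, p = henon_map(x, p, omega)
        if abs(x) > bound or abs(p) > bound:
            return False  # particle escaped: unstable initial condition
    return True

def dynamic_aperture(amplitudes):
    """Scan initial amplitudes from small to large and return the largest
    contiguously stable one -- a crude 1D estimate of the DA."""
    da = 0.0
    for a in amplitudes:
        if not survives(a):
            break
        da = a
    return da

da = dynamic_aperture(np.linspace(0.0, 1.0, 51))
```

Even this toy scan shows why real DA studies are expensive: each amplitude requires a full tracking run, and a realistic lattice has tens of thousands of elements per turn instead of one map evaluation.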

A key aspect of accelerator design is to maximize the stable area, or DA, while respecting a plethora of other objectives, by properly adjusting the 10’000s of electro-magnetic elements using a large number of high-level parameters. This requires efficient search algorithms for the available parameter space. The main objectives of this project are hence twofold: first, to identify and develop a suitable ML model of the stable phase-space area as a function of the machine input parameters by training on a large number of tracking simulations; and second, to develop an active learning framework that provides a targeted, efficient, yet exhaustive search for optimal accelerator designs in terms of parameter settings and/or lattice configurations, with the potential to revolutionize accelerator design. Parallelization of the algorithms will be relevant to make the best use of the available high-performance clusters at EPFL. Potentially, the DA ML model could also be employed to focus tracking only on particles in the relevant areas of the phase space. This would free up computational resources to extend the essential simulations beyond current limitations and improve the accuracy of the highly relevant particle beam lifetime estimates. In addition, we believe a paradigm shift is possible for this kind of simulation study by exploring the parameter space in a smart way and by avoiding redundant simulations. The LHC also provides a large amount of experimental data that is currently processed within the PACMAN project [7] and will be valuable for comparison and evaluation of the accelerator design framework proposed here.
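The active-learning loop described above can be sketched in miniature. The snippet below is a hedged illustration, not the project's actual framework: `expensive_da` is a hypothetical stand-in for a tracking campaign mapping one machine parameter to a DA value, and the acquisition rule (sample the widest unexplored interval) is a deliberately simple space-filling heuristic; a surrogate with calibrated uncertainty, such as a Gaussian process, would typically replace it:

```python
import numpy as np

def expensive_da(k):
    """Hypothetical stand-in for a costly tracking campaign: a smooth
    mapping from one machine parameter k (e.g. a sextupole setting)
    to the resulting dynamic aperture."""
    return 1.0 + 0.5 * np.sin(3.0 * k) - 0.3 * k**2

def active_learning(n_iter=10, domain=(-1.0, 1.0)):
    """Greedy active learning: repeatedly evaluate the expensive model
    where the current sample set is least informative (widest gap),
    instead of on a fixed dense grid."""
    ks = list(domain)                       # start from the two endpoints
    ys = [expensive_da(k) for k in ks]
    for _ in range(n_iter):
        ks_sorted = np.sort(np.asarray(ks))
        gaps = np.diff(ks_sorted)
        i = int(np.argmax(gaps))            # widest unexplored interval
        k_new = 0.5 * (ks_sorted[i] + ks_sorted[i + 1])
        ks.append(k_new)
        ys.append(expensive_da(k_new))      # one new "simulation" per step
    return np.asarray(ks), np.asarray(ys)

ks, ys = active_learning()
```

The point of the sketch is the budget: each loop iteration spends exactly one expensive evaluation, placed where it is expected to teach the model the most, which is how redundant simulations are avoided.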

Problem:

  • A future hadron collider shall be designed for maximum performance,
  • Tolerances and specifications of magnets must be determined (cost impact),
  • Survival time of circulating particles as a function of deviation from beam centre is a key element,
  • Tracking is computationally expensive: 10^6 turns, several 10’000 elements per turn.

Solution:

  • Determine stability of starting amplitudes,
  • Use machine learning to explore and interpolate the phase space (amplitudes) and accelerator parameter space (tuning),
  • ML will speed up process by orders of magnitude.
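The interpolation step in the solution above can be illustrated with a deliberately simple classifier. This is an assumption-laden sketch, not the project's chosen model: a k-nearest-neighbour vote over tracked initial conditions, with a toy circular stability region standing in for real tracking labels, shows how an ML model can predict stability at untracked points of phase space:

```python
import numpy as np

def knn_stable(train_X, train_y, query, k=5):
    """Predict stability of query points by majority vote among the k
    nearest labelled initial conditions -- a minimal interpolation of
    tracked stable/unstable labels across phase space."""
    preds = []
    for q in query:
        d = np.linalg.norm(train_X - q, axis=1)   # distance to all samples
        idx = np.argsort(d)[:k]                   # k nearest neighbours
        preds.append(train_y[idx].mean() > 0.5)   # majority vote
    return np.asarray(preds)

# Toy training set: label points inside radius 0.6 as "stable",
# standing in for expensive per-particle tracking results.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(400, 2))
y = (np.linalg.norm(X, axis=1) < 0.6).astype(float)

pred = knn_stable(X, y, np.array([[0.0, 0.0], [0.9, 0.9]]))
```

Once trained, such a model answers stability queries in microseconds, which is where the orders-of-magnitude speed-up over re-running the tracking comes from.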

Impact:

  • Maximise performance of this expensive research infrastructure under planning,
  • Magnet specifications with the best cost/benefit ratio.

Annexe

Additional resources

Bibliography

  1. CERN Council, “2020 Update of the European Strategy for Particle Physics”, CERN-ESU-013, June 2020.
  2. FCC Conceptual Design Report, January 2019.
  3. Swiss Accelerator Research and Technology Institute (CHART).
  4. L. Rivkin, Swiss Accelerator Research & Technology, CHIPP Board Meeting, Bern (CH), January 2019.
  5. EPFL Laboratory for Particle Accelerator Physics (LPAP).
  6. SixTrack single-particle tracking code.
  7. PACMAN project, 2019.
