ML4FCC

Machine Learning for the Future Circular Collider Design

Started: October 18, 2022
Status: In Progress
Abstract

In June 2020, the Council of the European Organization for Nuclear Research (CERN) updated the European strategy for particle physics to establish the electron-positron Higgs factory as the highest-priority facility after the Large Hadron Collider (LHC) era [1]. The strategy emphasizes the importance of ramping up the research and development for advanced accelerator and computing technologies to prepare for future collider facilities, such as the Future Circular Collider (FCC) [2].

The Swiss Accelerator Research and Technology Institute (CHART, [3,4]) appointed the Laboratory for Particle Accelerator Physics at EPFL [5] to spearhead the efforts for developing the framework and the particle beam dynamics tools required to design and optimize the FCC. While the FCC will undoubtedly lead to significant breakthroughs in high-energy particle physics, the design and operation of the machine as a precision instrument need careful consideration and present an exciting Big Data and Machine Learning (ML) opportunity.

People

Collaborators

SDSC Team:
Guillaume Obozinski
Yousra El-Bachir
Ekaterina Krymova

PI | Partners:

Paul Scherrer Institute:

  • Prof. Mike Seidel


CERN:

  • Dr. Tatiana Pieloni


Introduction

Accelerator performance is characterized by the size of the area in phase space where the beam particles feature bounded, i.e. stable, dynamics under long-term tracking (~10 h, or 10^8 revolutions). The dynamics is defined by the static and variable non-linear electric and magnetic field elements that comprise the machine, also known as the lattice. Knowledge of the boundary between the stable and the chaotic regimes, called the Dynamic Aperture (DA), and maximization of the DA are vital during the design phase and eventually for successful machine operation, since particles located in the unstable regime will be lost from the beam and reduce its lifetime. Such losses limit the collider’s luminosity reach and hence have a critical impact on its physics performance. At present, the best way to assess the DA is by means of particle tracking simulations [6]. Typically, simulations can cover only up to 10^6 revolutions around the ring (about 1 minute in real time). While the physics of the involved processes is well understood, the mathematical formulation is a non-linear dynamics problem that is highly sensitive to the initial conditions.
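The notion of stability under long-term tracking can be illustrated on a toy lattice. The following is a minimal sketch, not the project's actual tracking setup (which relies on SixTrack [6]): it uses the 2D Hénon map, a standard simplified model of a ring with a single sextupole-like non-linearity, to classify initial amplitudes as stable or lost and read off a crude one-dimensional DA estimate. The tune, turn count, and loss radius are illustrative choices.

```python
import numpy as np

def henon_map(x, px, mu):
    """One turn of the 2D Henon map: a linear rotation by phase mu
    plus a quadratic (sextupole-like) kick."""
    pk = px + x**2                      # non-linear kick
    c, s = np.cos(mu), np.sin(mu)
    return c * x + s * pk, -s * x + c * pk

def survives(x0, n_turns=10_000, mu=2 * np.pi * 0.205, r_max=10.0):
    """Track a single particle for n_turns; it counts as 'lost'
    once its amplitude exceeds r_max."""
    x, px = x0, 0.0
    for _ in range(n_turns):
        x, px = henon_map(x, px, mu)
        if x * x + px * px > r_max**2:
            return False
    return True

def dynamic_aperture(amplitudes, **kw):
    """Scan increasing initial amplitudes and return the largest one
    below the first loss (a crude 1D DA estimate)."""
    da = 0.0
    for a in amplitudes:
        if not survives(a, **kw):
            break
        da = a
    return da

amps = np.linspace(0.0, 2.0, 50)
print(f"toy DA estimate: {dynamic_aperture(amps):.3f}")
```

A real DA scan works in higher-dimensional phase space and over far more turns, which is precisely what makes it expensive.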

A key aspect of accelerator design is to maximize the stable area, or DA, while respecting a plethora of other objectives, by properly adjusting the 10’000s of electro-magnetic elements through a large number of high-level parameters. This requires efficient search algorithms over the available parameter space. The main objectives of this project are hence twofold: first, to identify and develop a suitable ML model of the stable phase-space area as a function of the machine input parameters by training on a large number of tracking simulations; and second, to develop an active learning framework that provides a targeted, efficient, yet exhaustive search for optimal accelerator designs in terms of parameter settings and/or lattice configurations, with the potential to revolutionize accelerator design. Parallelization of the algorithms will be relevant to make the best use of the available high-performance clusters at EPFL. Potentially, the DA ML model could also be employed to focus tracking only on the relevant areas of the phase space. This would free up computational resources to extend the essential simulations beyond current limitations and improve the accuracy of the highly relevant particle beam lifetime estimates. In addition, we believe a paradigm shift is possible for this kind of simulation study by exploring the parameter space in a smart way and by avoiding redundant computations. The LHC also provides a large amount of experimental data that is currently processed within the PACMAN project [7] and will be valuable for comparison and evaluation of the accelerator design framework proposed here.
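The active-learning idea above can be sketched in a few lines. Everything here is a placeholder: the "tracking" oracle is a toy stability boundary, the random-forest surrogate and the uncertainty-sampling acquisition rule are illustrative stand-ins, not the models the project will ultimately select.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def expensive_tracking(x):
    """Stand-in for an expensive tracking run: 1 = stable, 0 = lost.
    A wavy disc replaces the real stability boundary."""
    r_stable = 0.6 + 0.2 * np.sin(4 * np.arctan2(x[1], x[0]))
    return int(np.linalg.norm(x) < r_stable)

# Initial design in a toy 2D 'phase space' [-1, 1]^2, seeded with one
# clearly stable and one clearly unstable point so both classes appear.
X = np.vstack([[0.0, 0.0], [1.0, 1.0], rng.uniform(-1, 1, size=(20, 2))])
y = np.array([expensive_tracking(x) for x in X])

pool = rng.uniform(-1, 1, size=(2000, 2))  # candidate query points

for _ in range(10):  # active-learning rounds
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    p = clf.predict_proba(pool)[:, 1]
    # Uncertainty sampling: query the points the surrogate is least sure about
    idx = np.argsort(np.abs(p - 0.5))[:5]
    X = np.vstack([X, pool[idx]])
    y = np.append(y, [expensive_tracking(x) for x in pool[idx]])
    pool = np.delete(pool, idx, axis=0)

print(f"labels acquired: {len(y)}")
```

The point of the loop is that simulation budget concentrates near the stability boundary instead of being spent uniformly, which is where the claimed efficiency gain comes from.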

Problem

  • A future hadron collider shall be designed for maximum performance,
  • Tolerances and specifications of magnets must be determined (cost impact),
  • Survival time of circulating particles as a function of deviation from beam centre is a key element,
  • Assessing this is computationally expensive: 10^6 turns, several 10k elements per turn.
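For scale, the cost quoted in the bullets above multiplies out as follows; the element count and scan size are illustrative assumptions, not project figures.

```python
turns = 10**6               # turns per tracked particle
elements_per_turn = 3 * 10**4   # "several 10k" lattice elements, assumed ~3e4
particles = 10**4               # hypothetical size of one DA scan
passes = turns * elements_per_turn * particles
print(f"{passes:.1e} element passes")  # 3.0e+14 for a single scan
```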

Solution

  • Determine stability of starting amplitudes,
  • Use machine learning to explore and interpolate the phase space (amplitudes) and accelerator parameter space (tuning),
  • ML will speed up process by orders of magnitude.
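The speed-up rests on replacing per-particle tracking with a single cheap model evaluation. A minimal sketch, again on the toy Hénon map rather than a real FCC lattice, with a k-nearest-neighbours classifier standing in for whatever surrogate the project ultimately selects:

```python
import time
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

MU = 2 * np.pi * 0.205  # toy machine tune

def track_stable(x0, n_turns=1000, r2_max=100.0):
    """Vectorised toy tracking: one Henon-map 'turn' per iteration,
    applied to all initial amplitudes at once. Returns 1 = stable."""
    x = np.asarray(x0, dtype=float).copy()
    px = np.zeros_like(x)
    alive = np.ones(x.shape, dtype=bool)
    c, s = np.cos(MU), np.sin(MU)
    for _ in range(n_turns):
        pk = px + x * x                      # sextupole-like kick
        x, px = c * x + s * pk, -s * x + c * pk
        alive &= x * x + px * px < r2_max
        x = np.where(alive, x, 0.0)          # freeze lost particles
        px = np.where(alive, px, 0.0)
    return alive.astype(int)

# 'Expensive' labels on a coarse set of amplitudes ...
train_amps = np.linspace(0.0, 2.0, 300)
t0 = time.perf_counter()
labels = track_stable(train_amps)
t_track = time.perf_counter() - t0

# ... then a cheap surrogate labels a much denser grid.
clf = KNeighborsClassifier(n_neighbors=3).fit(train_amps[:, None], labels)
dense = np.linspace(0.0, 2.0, 50_000)
t0 = time.perf_counter()
pred = clf.predict(dense[:, None])
t_model = time.perf_counter() - t0

print(f"tracking {len(train_amps)} pts: {t_track:.3f}s, "
      f"surrogate on {len(dense)} pts: {t_model:.3f}s")
```

On a real lattice each tracked point costs minutes to hours rather than microseconds, so the gap between the two timings widens by many orders of magnitude.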

Impact

  • Maximise the performance of this expensive research infrastructure currently under planning,
  • Magnet specifications with the best cost/benefit ratio.


Bibliography

  1. CERN Council, “2020 Update of the European Strategy for Particle Physics”, CERN-ESU-013, June 2020.
  2. FCC Conceptual Design Report, January 2019.
  3. Swiss Accelerator Research and Technology Institute (CHART).
  4. L. Rivkin, “Swiss Accelerator Research & Technology”, CHIPP Board Meeting, Bern (CH), January 2019.
  5. EPFL Laboratory for Particle Accelerator Physics (LPAP).
  6. SixTrack single-particle tracking simulation code.
  7. PACMAN project, 2019.
