
ML4FCC
Machine Learning for the Future Circular Collider Design

Abstract
In June 2020, the Council of the European Organization for Nuclear Research (CERN) updated the European strategy for particle physics to establish the electron-positron Higgs factory as the highest-priority facility after the Large Hadron Collider (LHC) era [1]. The strategy emphasizes the importance of ramping up the research and development for advanced accelerator and computing technologies to prepare for future collider facilities, such as the Future Circular Collider (FCC) [2].
The Swiss Accelerator Research and Technology Institute (CHART, [3,4]) appointed the Laboratory for Particle Accelerator Physics at EPFL [5] to spearhead the efforts for developing the framework and the particle beam dynamics tools required to design and optimize the FCC. While the FCC will undoubtedly lead to significant breakthroughs in high-energy particle physics, the design and operation of the machine as a precision instrument needs careful consideration and presents an exciting Big Data and Machine Learning (ML) opportunity.
Presentation
People
Scientists


Guillaume Obozinski graduated with a PhD in Statistics from UC Berkeley in 2009. He did his postdoc in the Willow and Sierra teams at INRIA and Ecole Normale Supérieure in Paris, where he then held a researcher position until 2012. He was subsequently Research Faculty at Ecole des Ponts ParisTech until 2018. Guillaume has broad interests in statistics and machine learning and has worked on sparse modeling, optimization for large-scale learning, graphical models, relational learning and semantic embeddings, with applications in various domains from computational biology to computer vision.


Ekaterina received her PhD in Computer Science from the Moscow Institute of Physics and Technology, Russia. Afterwards, she worked as a researcher at the Institute for Information Transmission Problems in Moscow and later as a postdoctoral researcher in the Stochastics Group at the Faculty of Mathematics of the University of Duisburg-Essen, Germany. She has experience with various applied projects on signal processing, predictive modelling, macroeconomic modelling and forecasting, and social network analysis. She joined the SDSC in November 2019. Her interests include machine learning, non-parametric statistical estimation, structural adaptive inference, and Bayesian modelling.


Yousra studied Mathematics and Computational Statistics. She did a PhD in Statistical Learning followed by a post-doc both at EPFL, where she developed empirical Bayes methods for automatic L2 regularization problems in smooth regression for big data. Before joining SDSC, she worked at SIB/UNIL, where she developed a statistical optimization solution to the parent-of-origin identification problem in human genetics. Her technical expertise includes supervised learning, numerical optimization for machine learning, statistical modeling and methodology, high-performance and distributed computing for big data, Bayesian computation and time series analysis. She worked on applied problems in quantitative finance and environmental sciences.
Description
Introduction
Accelerator performance is characterized by the size of the region in phase space where the beam particles exhibit bounded, i.e. stable, dynamics under long-term tracking (~10 h, or 10^8 revolutions). The dynamics is defined by the static and variable non-linear electric and magnetic field elements that comprise the machine, also known as the lattice. Knowledge of the boundary between the stable and chaotic regimes, called the Dynamic Aperture (DA), and maximization of the DA are vital during the design phase and eventually for successful machine operation, since particles located in the unstable regime will be lost from the beam and reduce its lifetime. Such losses limit the collider’s luminosity reach and hence have a critical impact on its physics performance. At present, the best way to assess the DA is by means of particle tracking simulations [6]. Typically, simulations can cover only up to 10^6 revolutions around the ring (about one minute in real time). While the physics of the involved processes is well understood, the mathematical formulation is a non-linear dynamics problem that is highly sensitive to the initial conditions.
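The idea of estimating the DA by tracking can be illustrated with a toy model. The sketch below uses a simplified Hénon-like one-turn map (a linear rotation plus a sextupole-like quadratic kick), not the actual FCC lattice or a production tracking code such as SixTrack; the tune value, turn count, and amplitude bound are illustrative assumptions.

```python
import numpy as np

def one_turn(x, p, nu=0.205):
    """One turn of a toy non-linear lattice: a linear rotation by tune nu
    followed by a sextupole-like quadratic kick (a Henon-type map)."""
    c, s = np.cos(2 * np.pi * nu), np.sin(2 * np.pi * nu)
    p_kicked = p + x**2  # non-linear kick
    return c * x + s * p_kicked, -s * x + c * p_kicked

def survives(x0, p0, n_turns=1000, bound=10.0):
    """Track a single initial condition; call it 'stable' if it stays
    within a coarse amplitude bound for n_turns revolutions."""
    x, p = x0, p0
    for _ in range(n_turns):
        x, p = one_turn(x, p)
        if x * x + p * p > bound * bound:
            return False
    return True

# Scan initial amplitudes along one axis: the largest surviving
# amplitude is a crude one-dimensional proxy for the dynamic aperture.
amplitudes = np.linspace(0.0, 1.0, 101)
stable = np.array([survives(a, 0.0) for a in amplitudes])
da_estimate = amplitudes[stable].max()
```

A real DA study tracks millions of initial conditions in six-dimensional phase space through tens of thousands of lattice elements per turn, which is what makes the brute-force approach so expensive.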
A key aspect of accelerator design is to maximize the stable area, or DA, while respecting a plethora of other objectives, by properly adjusting tens of thousands of electromagnetic elements through a large number of high-level parameters. This requires efficient search algorithms over the available parameter space. The main objectives of this project are hence twofold: first, to identify and develop a suitable ML model of the stable phase-space area as a function of the machine input parameters, trained on a large number of tracking simulations; and second, to develop an active learning framework that provides a targeted, efficient, yet exhaustive search for optimal accelerator designs in terms of parameter settings and/or lattice configurations, with the potential to revolutionize accelerator design. Parallelization of the algorithms will be relevant to make the best use of the available high-performance clusters at EPFL. Potentially, the DA ML model could also be employed to track only the particles in relevant areas of the phase space. This would free up computational resources to extend the essential simulations beyond current limitations and improve the accuracy of the highly relevant particle beam lifetime estimates. In addition, we believe a paradigm shift is possible for this kind of simulation study by exploring the parameter space in a smart way and avoiding redundant computations. The LHC also provides a large amount of experimental data that is currently processed within the PACMAN project [7] and will be valuable for comparison and evaluation of the accelerator design framework proposed here.
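The active learning loop described above can be sketched in miniature: train a cheap surrogate on the simulations run so far, then spend the next expensive simulation where the surrogate is least certain, i.e. near the stability boundary. Everything here is a hypothetical stand-in: the "expensive" oracle is a simple circular boundary, and the surrogate is a k-nearest-neighbour vote rather than whatever model the project ultimately adopts.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_tracking(point):
    """Stand-in for a full tracking simulation: returns True when the
    initial condition is stable. Here: a hypothetical circular boundary."""
    return np.linalg.norm(point) < 0.6

def knn_uncertainty(candidates, X, y, k=5):
    """Disagreement among the k nearest labelled points; votes near 0.5
    mean the surrogate is unsure, i.e. the point is near the boundary."""
    d = np.linalg.norm(candidates[:, None, :] - X[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    votes = y[nearest].mean(axis=1)
    return 1.0 - 2.0 * np.abs(votes - 0.5)  # 1 = maximally uncertain

# Seed with a few random simulations, then repeatedly query the
# candidate the surrogate is least sure about.
X = rng.uniform(-1, 1, size=(10, 2))
y = np.array([expensive_tracking(p) for p in X], dtype=float)
for _ in range(40):
    cand = rng.uniform(-1, 1, size=(200, 2))
    u = knn_uncertainty(cand, X, y)
    pick = cand[np.argmax(u)]
    X = np.vstack([X, pick])
    y = np.append(y, expensive_tracking(pick))
```

Compared with a uniform scan, the queried points concentrate around the stability boundary, which is exactly where the extra simulation budget is informative.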
Problem:
- A future hadron collider shall be designed for maximum performance,
- Tolerances and specifications of magnets must be determined (cost impact),
- Survival time of circulating particles as a function of deviation from beam centre is a key element,
- Assessing this survival is computationally expensive: 10^6 turns, several tens of thousands of lattice elements per turn.
Solution:
- Determine stability of starting amplitudes,
- Use machine learning to explore and interpolate the phase space (amplitudes) and accelerator parameter space (tuning),
- ML will speed up process by orders of magnitude.
Impact:
- Maximise the performance of this expensive research infrastructure under planning,
- Magnet specifications chosen for the best cost/benefit ratio.
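Since each starting amplitude is tracked independently, the workload is embarrassingly parallel. A minimal sketch of this, reusing a toy Hénon-like one-turn map (illustrative tune, turn count, and amplitude bound, not FCC parameters), tracks a whole batch of initial conditions at once with vectorized numpy; on a cluster, such batches shard naturally across nodes.

```python
import numpy as np

def track_batch(x, p, n_turns=1000, nu=0.205, bound=10.0):
    """Track many initial conditions simultaneously: each turn applies a
    rotation plus a quadratic kick to the whole batch, flagging particles
    once they leave the coarse amplitude bound."""
    c, s = np.cos(2 * np.pi * nu), np.sin(2 * np.pi * nu)
    alive = np.ones(x.shape, dtype=bool)
    for _ in range(n_turns):
        pk = p + x**2
        x, p = c * x + s * pk, -s * x + c * pk
        alive &= (x * x + p * p) < bound * bound
        x = np.where(alive, x, 0.0)  # freeze lost particles to avoid overflow
        p = np.where(alive, p, 0.0)
    return alive

# One vectorized call replaces thousands of single-particle jobs.
x0 = np.linspace(0.0, 1.0, 2001)
alive = track_batch(x0, np.zeros_like(x0))
```

The surviving fraction of such a batch is the raw material for both the DA estimate and the training set of the ML surrogate.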
Bibliography
1. CERN Council, “2020 Update of the European Strategy for Particle Physics”, CERN-ESU-013, June 2020.
2. FCC Collaboration, “FCC Conceptual Design Report”, January 2019.
3. Swiss Accelerator Research and Technology Institute (CHART).
4. L. Rivkin, “Swiss Accelerator Research & Technology”, CHIPP Board Meeting, Bern (CH), January 2019.
5. EPFL Laboratory for Particle Accelerator Physics (LPAP).
6. SixTrack single-particle tracking simulation code.
7. PACMAN project, 2019.