Particle Accelerators and Machine Learning

January 2, 2019
In Progress


Particle accelerator facilities have a wide range of operational needs related to tuning, optimisation, and control. At the Large Hadron Collider (LHC) at CERN, reducing beam losses mitigates the risks associated with the high beam power and leads to higher particle collision rates and a deeper understanding of the underlying physics. To meet these demands, particle accelerators rely on interactions with control systems, on fine-tuning of machine settings by operators, on online optimisation routines, and on databases of previous settings known to be optimal for a desired operating condition. We aim to bring Machine Learning (ML) to particle accelerator operation in order to increase performance: each of these operational needs has corresponding ML-based approaches that could supplement the existing workflows. In addition, the LHC findings will inform new HL-LHC and FCC designs and prepare for more effective operation of the future FCC.



SDSC Team:
Ekaterina Krymova
Guillaume Obozinski

PI | Partners:

Particle Accelerator Physics Laboratory:

  • Dr. Tatiana Pieloni
  • Dr. Michael Schenk
  • Loic Coyle



Project goals: minimise beam losses, improve control of accelerator parameters, and prevent unnecessary machine interruptions.


We aim to implement the digital-twin paradigm, i.e. a virtual representation of the real-world accelerator. At the same time, this could open up new virtual- and augmented-reality opportunities, which are expected to play a significant role in the implementation of the future FCC.
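As a minimal sketch of the digital-twin idea, one could fit a surrogate model on logged machine data and query it instead of the real accelerator. The example below uses a Gaussian process regressor on synthetic data; the knob names, the two-parameter setting space, and the beam-loss function are purely illustrative assumptions, not the project's actual model or data.

```python
# Hypothetical digital-twin surrogate: a regressor mapping machine settings
# (e.g. two tuning knobs) to an observed beam-loss figure. All data here is
# synthetic and stands in for real LHC operational logs.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Synthetic "logged" data: 200 past settings and the beam loss seen for each.
settings = rng.uniform(-1.0, 1.0, size=(200, 2))
beam_loss = (settings ** 2).sum(axis=1) + 0.01 * rng.standard_normal(200)

# Fit the surrogate on the logged data.
twin = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-2)
twin.fit(settings, beam_loss)

# Query the twin at a candidate setting instead of perturbing the machine;
# the predictive uncertainty (sigma) indicates how much to trust the answer.
candidate = np.array([[0.1, -0.2]])
predicted_loss, sigma = twin.predict(candidate, return_std=True)
```

The predictive uncertainty is what makes such a twin useful operationally: candidate settings can be screened cheaply, and only confident, promising ones need to be tried on the real machine.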

Proposed approach:

We propose to gather a large amount of accelerator data in collaboration with the LHC Operation groups, to evaluate automatic and semi-automatic ways to optimise and steer the overall collider set-up, and to define the strategy for the operational aspects of future projects (i.e. HL-LHC and FCC). In parallel with the operational data accumulated during the physics runs, time will be devoted to machine development studies to test the robustness of the models developed for an automated optimisation of collider performance. In dedicated experiments, we will request the trained model to predict and set new parameters to improve the beam lifetimes in the LHC. Depending on the results obtained, extending the models to other accelerators of the CERN complex and to future machines (HL-LHC, FCC) will be a natural path for continuing the collaboration.
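The step in which a trained model proposes new parameters can be sketched as a simple search over candidate settings scored by the model's predicted beam lifetime. Everything in this example is a hypothetical stand-in: the `predicted_lifetime` function, its peak location, and the two-knob search space are illustrative assumptions, not the project's actual model.

```python
# Illustrative sketch of the semi-automatic tuning step: propose candidate
# settings, score them with a (stand-in) trained model, and suggest the one
# with the best predicted beam lifetime to the operators.
import numpy as np

def predicted_lifetime(setting):
    # Stand-in for a trained model's prediction; a real model would be fit
    # on LHC operational data. Here lifetime peaks at setting = (0.3, -0.5).
    target = np.array([0.3, -0.5])
    return float(np.exp(-np.sum((setting - target) ** 2)))

rng = np.random.default_rng(1)
candidates = rng.uniform(-1.0, 1.0, size=(500, 2))  # random search over knobs
scores = np.array([predicted_lifetime(c) for c in candidates])
best = candidates[scores.argmax()]                   # setting to propose
```

In practice, random search would likely be replaced by a sample-efficient strategy (e.g. Bayesian optimisation), since each trial on the real machine is costly, but the propose-score-select loop is the same.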


Figure 1: LHC fill and beam modes (from P. M. Wyszkowski, "ESB application for effective synchronization of large volume measurements data", Diss., AGH-UST, Cracow, 2011).
Figure 2: Schematic view of the LHC with its two-beam design (from O. Brüning, H. Burkhardt, S. Myers, "The Large Hadron Collider", Progress in Particle and Nuclear Physics, 67(3):705-734, 2012).


Additional resources


  1. G. Apollinari et al. (including T. Pieloni), "High-Luminosity Large Hadron Collider (HL-LHC): Preliminary Design Report – Chapter 2: Machine Layout and Performance", Preliminary Design Report.
  2. L. Coyle, "Machine learning applications for hadron colliders: LHC lifetime optimization and designing Future Circular Colliders", presented at the Annual Meeting of the Swiss Physical Society 2018, EPFL.

