Description
This study aims to use machine learning techniques to build models of the evolution of proton beam losses in the Large Hadron Collider (LHC) for different operational scenarios. The models are trained on LHC 2017 operational data using a Random Forest supervised learning algorithm. From the models and from covariance calculations we extract the parameters that contribute most to the beam intensity lifetime.
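To make the approach concrete, below is a minimal sketch of such a pipeline, assuming the operational data has been tabulated with one row per time step, the machine parameters as columns, and the measured beam intensity lifetime as the target; the file name and column names are hypothetical placeholders, not the actual analysis code.

```python
# Minimal sketch (hypothetical file and column names): fit a Random Forest
# to tabulated machine parameters and rank them by their contribution
# to the beam intensity lifetime.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

data = pd.read_csv("lhc_2017_operational_data.csv")  # hypothetical export
X = data.drop(columns=["beam_lifetime"])             # machine parameters
y = data["beam_lifetime"]                            # measured intensity lifetime

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))

# Two complementary views of which parameters drive the lifetime:
# impurity-based importances from the forest, and plain covariances.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(10))
print(data.cov()["beam_lifetime"].sort_values(ascending=False))
```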
In parallel, a similar method has been applied to simulated particle losses, with the goal of training a model on a dataset produced by simulations. The aim is twofold: to understand how the losses depend on the machine settings, in order to explain the experimental observations, and to use the model to predict beam lifetimes for the FCC study, avoiding computationally expensive simulation campaigns.
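As an illustration of the surrogate idea, the sketch below trains a forest on a simulated-loss dataset and then queries it for a new machine configuration instead of launching another tracking campaign; the file name, parameter names, and working-point values are invented for the example.

```python
# Minimal sketch (hypothetical file, columns, and values): a simulation-trained
# surrogate replaces an expensive tracking run with a cheap model query.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

sim = pd.read_csv("simulated_losses.csv")  # one row per simulated configuration
X_sim = sim.drop(columns=["lifetime"])     # machine settings scanned in the campaign
surrogate = RandomForestRegressor(n_estimators=500, random_state=0)
surrogate.fit(X_sim, sim["lifetime"])

# Query the surrogate for a candidate working point (made-up values;
# the columns must match those used for training).
new_point = pd.DataFrame([{
    "tune_h": 0.31, "tune_v": 0.32,
    "chromaticity": 15.0, "octupole_current": 350.0,
}])
print("predicted beam lifetime:", surrogate.predict(new_point)[0])
```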