Pattern recognition is the foundation of event reconstruction in high energy physics, but it has also become, alongside detector simulation, one of the most computing-intensive tasks.
For the Large Hadron Collider (LHC), the luminosity is expected to increase by a factor of five by 2025, leading to proton collisions of very high complexity. The current pattern recognition algorithms, whose role is to reconstruct the individual particles from the energy deposits in the detectors, are being stretched to their limits. A broad effort is needed, bringing together experts from the different experiments as well as data science experts in novel optimisation algorithms, to try out completely different approaches and to benefit from recent advances in machine learning. On the one hand, Graph Networks, Recurrent Neural Networks, Convolutional Neural Networks, Monte Carlo Tree Search and probabilistic hashing have been proposed to solve this problem; on the other hand, further optimisation of the current pattern recognition algorithms, such as concurrent algorithmic execution or optimisation for accelerated hardware, is being pursued. Several programs exploiting machine learning techniques have been launched recently, within the context of the tracking machine learning challenge, in the field of quantum computing, and within the HEP experiments themselves, in order to meet the requirements of future data-taking campaigns at the LHC and other HEP experiments.
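To give a flavour of the graph-based direction mentioned above, the sketch below shows how detector hits can be cast as a graph whose edges are candidate track segments, which an edge-classifying network would then keep or reject. It is a minimal, hypothetical illustration: the toy hit coordinates, geometric cuts and the randomly initialised scorer are invented for this example and do not correspond to any experiment's actual pipeline.

```python
# Illustrative sketch only: building a hit graph for edge-classification-style
# tracking. All numbers and cuts are invented for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)

# Toy hits: (layer index, azimuthal angle phi, longitudinal coordinate z).
n_hits = 200
hits = np.column_stack([
    rng.integers(0, 5, n_hits),            # detector layer
    rng.uniform(-np.pi, np.pi, n_hits),    # phi
    rng.uniform(-100.0, 100.0, n_hits),    # z (mm)
])

# Candidate edges: connect hits on adjacent layers that are close in phi and z,
# a crude geometric preselection applied before any learned classifier.
edges = []
for i in range(n_hits):
    for j in range(n_hits):
        if hits[j, 0] == hits[i, 0] + 1:
            dphi = abs(np.angle(np.exp(1j * (hits[j, 1] - hits[i, 1]))))
            dz = abs(hits[j, 2] - hits[i, 2])
            if dphi < 0.1 and dz < 20.0:
                edges.append((i, j))
edges = np.array(edges)

# Placeholder edge scorer: a tiny randomly initialised MLP standing in for a
# trained graph/edge network that would keep true segments and reject fakes.
def edge_score(x, w1, w2):
    h = np.tanh(x @ w1)
    return 1.0 / (1.0 + np.exp(-(h @ w2)))

features = np.column_stack([
    hits[edges[:, 1], 1] - hits[edges[:, 0], 1],   # delta phi
    hits[edges[:, 1], 2] - hits[edges[:, 0], 2],   # delta z
])
w1 = rng.normal(size=(2, 8))
w2 = rng.normal(size=8)
scores = edge_score(features, w1, w2)
print(f"{len(edges)} candidate edges, mean score {scores.mean():.2f}")
```

In a realistic setting the geometric preselection would use the actual detector geometry, and the scorer would be a graph neural network trained on simulated events rather than random weights; the sketch only conveys the overall structure of the approach.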