19-21 October 2020
Auditorium Folco Portinari
Europe/Zurich timezone

Scientific Programme

  • Machine Learning in High Energy Physics

    High Energy Physics has been taking advantage of Machine Learning algorithms for more than thirty years to address problems such as pattern recognition for tracking, multivariate classification of collision events, and rejection of background events at the trigger level.

    More recently, representation learning has been explored for a wide range of applications, including tracking, jet reconstruction and parametrization of the detector response. Since many modern machine learning algorithms are based on likelihood maximization, applications to data modelling are also being investigated.

  • Machine Learning for Fundamental Physics

    The search for Gravitational Waves is producing a large amount of data that can be effectively analyzed using deep neural networks. Classification of neutrino interaction events takes advantage of image-processing algorithms based on convolutional neural networks.

    The limited bandwidth available to transfer data from satellite-borne cosmic-ray experiments is an important motivation to enhance trigger performance, possibly by taking advantage of machine learning solutions.

  • Machine Learning for Medical Physics

    The digitization of clinical exam reports provides a large amount of data, opening the way to machine learning techniques that assist clinicians in the definition of therapies.
    Convolutional Neural Networks are being explored to translate CT, PET and MRI scans into learned high-level features. Generative Models are under study to map the results obtained with one instrument into the expected results with another, providing helpful support to clinicians.

  • Hardware and Infrastructure for large-scale Machine Learning

    The cost of processing power is falling much faster than the cost of storage, pushing towards real-time analysis approaches intended to reduce the amount of background or otherwise useless data stored on disk. As a consequence, distributed computing systems such as the LHC Computing Grid will have to combine data reconstruction and selection jobs with an increasing demand for resources to interpret and model the data. The latter may profit from the rapid evolution of High-Performance Computing solutions for data centers in the private sector, including the use of hardware accelerators such as GPUs and FPGAs.
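
    The likelihood-maximization idea mentioned under the High Energy Physics topic can be made concrete with a minimal, generic sketch: fitting a Gaussian model to data by choosing the parameters that maximize the log-likelihood. All function names and numbers below are illustrative, not taken from any of the experiments discussed.

    ```python
    import numpy as np

    def gaussian_log_likelihood(data, mu, sigma):
        """Total log-likelihood of `data` under a Normal(mu, sigma^2) model."""
        return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                      - (data - mu) ** 2 / (2 * sigma**2))

    def fit_gaussian_mle(data):
        """For a Gaussian, the maximum-likelihood estimates are closed-form:
        the sample mean and the (biased) sample standard deviation."""
        mu_hat = np.mean(data)
        sigma_hat = np.sqrt(np.mean((data - mu_hat) ** 2))
        return mu_hat, sigma_hat

    # Illustrative data: 10,000 draws from a known Gaussian.
    rng = np.random.default_rng(42)
    sample = rng.normal(loc=1.5, scale=0.3, size=10_000)

    mu_hat, sigma_hat = fit_gaussian_mle(sample)

    # The MLE maximizes the log-likelihood: perturbed parameters score lower.
    best = gaussian_log_likelihood(sample, mu_hat, sigma_hat)
    assert best >= gaussian_log_likelihood(sample, mu_hat + 0.1, sigma_hat)
    assert best >= gaussian_log_likelihood(sample, mu_hat, sigma_hat * 1.1)
    ```

    The same principle, with gradient-based optimization replacing the closed-form solution, underlies the training of many of the neural-network models in the programme above.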