Transfer Learning in Deep Neural Networks across LHC Experiments

Speaker

Bernard Brickwedde (Johannes Gutenberg Universitaet Mainz (DE))

Description

In recent years, deep learning techniques have become highly popular in high energy physics experiments, as they deliver very promising results not only for signal classification but also for object reconstruction. The neural network architectures involved are typically rather complex and therefore require significant resources during the training phase. One possibility to reduce the training time and resources is the use of transfer learning techniques. In this work, we present a first study of the possibility of transferring networks trained on experimental data of one LHC experiment to another, focusing on the expected savings in training time as well as the associated effects on performance. Similarly, we study the possibility of transferring networks trained on fast simulations to fully simulated data. The latter would allow scientists outside the LHC collaborations to develop complex tools that could easily be adapted and used by the LHC experiments.
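To illustrate the general idea of reusing a pretrained network, the sketch below shows a generic transfer-learning recipe in PyTorch: the pretrained layers are frozen and only a new output head is fine-tuned on the target dataset. This is a minimal illustration, not the setup used in this study; the network architecture, feature count, checkpoint name, and hyperparameters are all illustrative assumptions.

import torch
import torch.nn as nn

N_FEATURES = 20   # hypothetical number of input features per event
N_CLASSES = 2     # e.g. signal vs. background

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        # "body" learns generic representations; "head" is task/experiment specific
        self.body = nn.Sequential(
            nn.Linear(N_FEATURES, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.head = nn.Linear(64, N_CLASSES)

    def forward(self, x):
        return self.head(self.body(x))

model = Classifier()

# Step 1: in a real workflow the body would be loaded from a checkpoint
# pretrained on the source domain (e.g. fast simulation or another experiment).
# model.load_state_dict(torch.load("pretrained_source.pt"))  # hypothetical file

# Step 2: freeze the pretrained body so only the head is updated.
for param in model.body.parameters():
    param.requires_grad = False

# Step 3: re-initialise the head and fine-tune it on the target dataset
# (full simulation or the other experiment's data).
model.head = nn.Linear(64, N_CLASSES)
optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# Dummy batch standing in for the target-domain training data.
x = torch.randn(256, N_FEATURES)
y = torch.randint(0, N_CLASSES, (256,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

Because only the small head is trained while the frozen body is reused, the number of updated parameters, and hence the training time, is substantially reduced compared to training the full network from scratch.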

Authors

Bernard Brickwedde (Johannes Gutenberg Universitaet Mainz (DE)), Matthias Schott (CERN / University of Mainz)

Presentation materials

There are no materials yet.