Transfer learning for Smart Background Simulation at Belle II

22 Oct 2024, 15:00
18m
Large Hall A

Talk · Track 5 - Simulation and analysis tools · Parallel (Track 5)

Speaker

Nikolai Hartmann (Ludwig Maximilians Universität (DE))

Description

Accurate modeling of backgrounds for analysis development requires sufficiently large simulated samples of background data. When searching for rare processes, a large fraction of these expensively produced samples is discarded by the analysis criteria that isolate the rare events. At the Belle II experiment, the event generation stage accounts for only a small fraction of the computational cost of the whole simulation chain, which motivates filtering the simulation at this stage. Deep neural network architectures based on graph neural networks have proven useful for predicting approximately which events will pass such a filter, even in cases where there is no simple correlation between generator-level and reconstruction-level quantities. However, training these models requires large training data sets, which are hard to obtain for filters with very low efficiencies. In this presentation we show how a generic model, pre-trained on filters with high efficiencies, can be fine-tuned to also predict filters for which only little training data is available. This also opens opportunities for online learning during the simulation process, where no separate training step is required.
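
To illustrate the fine-tuning idea described in the abstract, the following is a minimal sketch in PyTorch, not the Belle II implementation: a classifier pre-trained on high-efficiency filters is adapted to a low-efficiency filter by freezing the pre-trained encoder and training only a small output head on the scarce labelled events. The names (PretrainedBackbone), feature dimensions, and the ~5% pass rate are illustrative assumptions.

```python
# Hedged sketch of transfer learning for a low-efficiency skim filter.
# PretrainedBackbone stands in for a graph-network encoder of generator-level
# events; none of these names are Belle II software interfaces.
import torch
import torch.nn as nn

class PretrainedBackbone(nn.Module):
    """Stand-in for an encoder pre-trained on high-efficiency filters."""
    def __init__(self, n_features=16, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

backbone = PretrainedBackbone()   # in practice: load pre-trained weights here
head = nn.Linear(64, 1)           # new head for the low-efficiency filter

# Freeze the pre-trained encoder; only the small head is trained,
# which is what makes learning from a few labelled events feasible.
for p in backbone.parameters():
    p.requires_grad = False

opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Tiny synthetic sample standing in for the scarce filter decisions.
x = torch.randn(256, 16)
y = (torch.rand(256, 1) < 0.05).float()  # ~5% pass rate (low-efficiency filter)

for epoch in range(10):
    logits = head(backbone(x))
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same structure would allow incremental (online) updates of the head while events are being simulated, since no separate large-scale training step is required.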

Primary author

Nikolai Hartmann (Ludwig Maximilians Universität (DE))

Co-authors

Boyang Yu
Daniel Pollmann (Ludwig Maximilians Universität (DE))
Thomas Kuhr (Ludwig Maximilians Universität (DE))
