Searching for rare physics processes requires a good understanding of the
backgrounds involved, which in turn demands large amounts of simulated data
that are computationally expensive to produce. The Belle II collaboration plans
to collect 50 times the data volume of its predecessor, Belle. As the recorded
data volume grows, the required volume of simulated data grows with it. Because
aggressive event selections are applied to enrich the signal processes of
interest, much of the simulated data is ultimately discarded.
This talk presents a method for predicting, directly after the computationally
inexpensive event-generation step, which events would later be discarded. This
is achieved by applying graph neural networks to the simulated event decay tree.
Only events selected by the neural network are passed on to the
resource-intensive detector-simulation and reconstruction steps. False
negatives from this selection can bias the distributions of observables for
the filtered events. Possible ways to mitigate this are also discussed.
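To make the filtering idea concrete, the following is a minimal, hypothetical sketch (not the Belle II implementation): a decay tree is represented as node features plus parent–child edges, one round of message passing updates the node states, and a pooled graph representation yields a sigmoid "keep" score that decides whether the event proceeds to detector simulation. All feature values and weights here are illustrative placeholders.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def keep_score(node_feats, edges, w_self=0.6, w_msg=0.4, bias=0.0):
    """Score one generated event.

    node_feats: one scalar feature per decay-tree node (placeholder for an
                encoded particle property such as a PDG-code embedding).
    edges:      (parent, child) index pairs defining the decay tree.
    """
    # Message passing: each node aggregates the features of its neighbours.
    msgs = [0.0] * len(node_feats)
    for parent, child in edges:
        msgs[parent] += node_feats[child]   # child -> parent message
        msgs[child] += node_feats[parent]   # parent -> child message
    # Shared linear update per node, then mean-pool to a graph representation.
    updated = [w_self * f + w_msg * m + bias for f, m in zip(node_feats, msgs)]
    graph_repr = sum(updated) / len(updated)
    return sigmoid(graph_repr)

# Toy decay tree: a root particle (node 0) with two daughters (nodes 1, 2).
feats = [0.5, -0.2, 0.8]
edges = [(0, 1), (0, 2)]
score = keep_score(feats, edges)
# Only events above threshold would be handed to detector simulation.
passes_filter = score > 0.5
```

In practice the node update and readout would be learned layers trained on labelled events (kept vs. discarded by the downstream selection), and the threshold trades simulation savings against the false-negative rate discussed above.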