Simulated event samples from Monte-Carlo event generators (MCEGs) are a backbone of the LHC physics programme.
However, for Run III, and in particular for the HL-LHC era, computing budgets are becoming increasingly constrained, while at the same time the push to higher accuracies
is making event generation significantly more expensive.
Modern machine-learning (ML) techniques can help to reduce the cost of creating such samples in two ways.
One way is to use ML models that learn the event distribution of the entire MCEG toolchain, or parts of it, such that events can subsequently be generated from these \emph{replacement models}
in a fraction of the time a full MCEG would require.
This ansatz is, however, intrinsically limited by the available training data.
Another way, and the one discussed in this talk, is to keep the MCEG
and to use ML \emph{assistant models} to alleviate specific performance bottlenecks.
One of these bottlenecks is the sampling of the high-dimensional phase space of complex processes,
for which a given target distribution must be approximated as closely as possible.
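To recall the generic setup: in importance sampling, a target $f$ over the phase space is integrated by drawing points from a sampling density $g$,
\[
  I = \int f(x)\,\mathrm{d}x \approx \frac{1}{N}\sum_{i=1}^{N} \frac{f(x_i)}{g(x_i)}\,, \qquad x_i \sim g\,,
\]
with event weights $w_i = f(x_i)/g(x_i)$. The more closely $g$ follows the shape of $f$, the smaller the spread of the weights and the more efficient the sampling.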
This is a very generic problem, which allows one to explore methods that have been developed
in entirely different fields of physics, or even outside of physics.
In this talk I will discuss the potential of Neural Importance Sampling and Nested Sampling
to increase the phase-space sampling efficiency,
and of neural-network surrogates of the integrand to increase the efficiency of event unweighting.
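For unweighting, the figure of merit is the efficiency $\epsilon = \langle w \rangle / w_{\mathrm{max}}$ of the accept/reject step. A surrogate $s(x)$ that is cheap to evaluate can be used in a two-stage unweighting; as a sketch (the exact algorithm may differ from the one presented here), one first accepts a point with probability $s(x)/s_{\mathrm{max}}$ and only then evaluates the exact weight $w(x)$, accepting with probability
\[
  \frac{w(x)/s(x)}{\left[w/s\right]_{\mathrm{max}}}\,,
\]
such that the combined acceptance remains proportional to $w(x)$, while the expensive exact weight is computed only for the small fraction of points that survive the first step.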
The application of these methods within the \textsc{Sherpa} generator framework is then reviewed.