Speaker
Timo Janssen
Description
Generative models can speed up parton-level Monte Carlo event generation. Among them, Normalizing Flows are especially interesting because they allow exact likelihood evaluation. Continuous Normalizing Flows (CNFs) have been shown to be more expressive than discrete, layer-based flows, and new simulation-free training methods significantly reduce their training cost. We show that CNFs trained by Flow Matching can improve the sampling of parton-level QCD scattering events compared to traditional methods such as Vegas, both in terms of Monte Carlo variance and unweighting efficiency. We evaluate their performance for relevant LHC processes and compare it to that of discrete flows and Vegas.
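To give a rough feel for the Flow Matching training referenced in the abstract, here is a minimal toy sketch (not the authors' implementation): a velocity field is regressed onto the straight-line target u = x1 - x0 between noise samples x0 and toy "data" x1, with no ODE simulation during training; sampling then integrates dx/dt = v(x, t). The linear model, the Gaussian toy target, and the helper names (`sample_batch`, `features`) are all illustrative assumptions.

```python
import numpy as np

# Toy conditional Flow Matching sketch (illustrative only).
rng = np.random.default_rng(0)
dim = 2

def sample_batch(n):
    """Draw noise/data endpoint pairs and a linear interpolant."""
    x0 = rng.standard_normal((n, dim))            # noise endpoints
    x1 = rng.standard_normal((n, dim)) * 0.1 + 2  # toy "data" near (2, 2)
    t = rng.uniform(size=(n, 1))                  # interpolation times
    xt = (1 - t) * x0 + t * x1                    # point on the path
    u = x1 - x0                                   # target velocity
    return xt, t, u

def features(xt, t):
    """Inputs of a linear velocity model v(x, t) = [x, t, 1] @ W
    (a stand-in for the neural network used in practice)."""
    return np.concatenate([xt, t, np.ones_like(t)], axis=1)

def loss(W, xt, t, u):
    """Flow Matching objective: MSE between model and target velocity."""
    return np.mean((features(xt, t) @ W - u) ** 2)

# Simulation-free training: plain SGD on the regression loss.
W = np.zeros((dim + 2, dim))
lr = 0.05
for step in range(1000):
    xt, t, u = sample_batch(256)
    phi = features(xt, t)
    grad = 2 * phi.T @ (phi @ W - u) / len(xt)
    W -= lr * grad

# Sampling: Euler integration of dx/dt = v(x, t) from t=0 to t=1.
x = rng.standard_normal((1000, dim))
for k in range(50):
    t = np.full((len(x), 1), k / 50)
    x += features(x, t) @ W / 50
# Samples x now lie roughly around the toy data mean (2, 2).
```

In the event-generation setting, the learned flow plays the role that the Vegas grid plays in classical importance sampling: its exact likelihood gives the proposal density needed for the Monte Carlo weights.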