Description
Theory predictions for the LHC require precise numerical phase-space integration and the generation of unweighted events. We combine machine-learned multi-channel weights with a normalizing flow for importance sampling to improve on classical methods of numerical integration. We develop an efficient bi-directional setup based on an invertible network, combining online and buffered training for potentially expensive integrands. We illustrate our method for the Drell-Yan process with an additional narrow resonance. In addition to these results from the paper "MadNIS - Neural Multi-Channel Importance Sampling", MadNIS now interfaces with MadGraph to use its matrix elements and channel mappings. I will present preliminary results from our upcoming comparison between MadNIS and classical MadGraph for various LHC processes.
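
To illustrate the multi-channel importance sampling idea underlying this approach, the following is a minimal Python sketch with a toy integrand containing a narrow resonance. All names and the fixed channel-weight choice are illustrative assumptions, not the MadNIS implementation, where the channel weights are learned by a network and the per-channel densities come from normalizing flows and physics-inspired channel mappings.

# Minimal sketch of multi-channel importance sampling (illustrative only;
# function and channel names are hypothetical, not the MadNIS API).
import numpy as np

rng = np.random.default_rng(0)

# Toy integrand on [0, 1]: a flat background plus a narrow resonance peak.
M, G = 0.5, 0.01  # peak position and width (Breit-Wigner-like)
def f(x):
    return 1.0 + 0.01 / ((x - M) ** 2 + G ** 2)

# Channel 1: uniform sampling, density g1(x) = 1 on [0, 1].
def sample_g1(n):
    return rng.uniform(0.0, 1.0, n)
def g1(x):
    return np.ones_like(x)

# Channel 2: Breit-Wigner mapping that concentrates points near the peak.
a, b = np.arctan((0.0 - M) / G), np.arctan((1.0 - M) / G)
def sample_g2(n):
    u = rng.uniform(0.0, 1.0, n)
    return M + G * np.tan(a + u * (b - a))
def g2(x):
    return G / ((b - a) * ((x - M) ** 2 + G ** 2))

# Channel weights alpha_i(x) with sum_i alpha_i(x) = 1; here the standard
# fixed choice alpha_i proportional to g_i (in MadNIS these are learned).
def alphas(x):
    d1, d2 = g1(x), g2(x)
    tot = d1 + d2
    return d1 / tot, d2 / tot

# Estimate I = sum_i E_{x ~ g_i}[ alpha_i(x) * f(x) / g_i(x) ]
n = 100_000
x1, x2 = sample_g1(n), sample_g2(n)
w1 = alphas(x1)[0] * f(x1) / g1(x1)
w2 = alphas(x2)[1] * f(x2) / g2(x2)
estimate = w1.mean() + w2.mean()
error = np.sqrt(w1.var() / n + w2.var() / n)
print(f"I = {estimate:.4f} +/- {error:.4f}")

In this sketch the resonance channel absorbs the peak of the integrand, so the per-channel weights w1 and w2 stay flat and the variance of the combined estimate is much smaller than for uniform sampling alone; learning the channel weights and the sampling densities, as in MadNIS, pushes this reduction further.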