Generative: Partons and Phase Space

Conveners:
- Ramon Winterhalder (UCLouvain)
Transformers have become the primary architecture for natural language processing. In this study, we explore their use for auto-regressive density estimation in high-energy jet physics, treating jets and their constituents in analogy to sentences and their words in natural language. Specifically, we investigate density estimation for light QCD jets and hadronically decaying boosted top jets...
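For orientation, the sketch below shows one way such an auto-regressive density could be set up: each jet is a sequence of discretised constituent tokens and a causal transformer predicts p(x_i | x_<i), trained with next-token cross-entropy. This is only a minimal illustration, not the authors' code; the tokenisation, vocabulary size and model dimensions are placeholder assumptions.

```python
import torch
import torch.nn as nn

class JetAutoregressiveModel(nn.Module):
    """Causal transformer over tokenised jet constituents (illustrative sizes)."""
    def __init__(self, vocab_size=1024, d_model=128, n_heads=8, n_layers=4, max_len=64):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer-encoded constituents
        seq_len = tokens.size(1)
        pos = torch.arange(seq_len, device=tokens.device)
        x = self.token_emb(tokens) + self.pos_emb(pos)
        # Additive causal mask so each position only attends to earlier constituents.
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf"),
                                       device=tokens.device), diagonal=1)
        h = self.encoder(x, mask=causal)
        return self.head(h)  # logits for p(x_i | x_<i)

model = JetAutoregressiveModel()
tokens = torch.randint(0, 1024, (32, 20))        # toy batch of tokenised jets
logits = model(tokens[:, :-1])                   # predict the next constituent token
loss = nn.functional.cross_entropy(logits.reshape(-1, 1024), tokens[:, 1:].reshape(-1))
loss.backward()                                  # negative log-likelihood training step
```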
In High Energy Physics, generating physically meaningful parton configurations from a collision reconstructed within a detector is a critical step for many complex analysis tasks, such as Matrix Element Method computations and Bayesian inference on parameters of interest. This contribution introduces a novel approach that employs generative machine learning architectures, Transformers...
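The abstract does not spell out the architecture, so the following is only a schematic guess at how a transformer could map reconstructed objects to parton-level four-momenta: an encoder-decoder model conditioned on detector-level inputs. Class name, tensor shapes and the plain regression loss are illustrative assumptions; a genuinely generative variant would replace the deterministic output head with a sampled density.

```python
import torch
import torch.nn as nn

class RecoToPartonTransformer(nn.Module):
    """Hypothetical conditional encoder-decoder mapping reco objects to partons."""
    def __init__(self, reco_dim=4, parton_dim=4, d_model=128, n_heads=8, n_layers=4):
        super().__init__()
        self.reco_proj = nn.Linear(reco_dim, d_model)
        self.parton_proj = nn.Linear(parton_dim, d_model)
        self.transformer = nn.Transformer(d_model, n_heads, n_layers, n_layers,
                                          dim_feedforward=4 * d_model, batch_first=True)
        self.out = nn.Linear(d_model, parton_dim)

    def forward(self, reco, partons_in):
        # reco: (batch, n_reco, 4) detector-level objects used as the condition.
        # partons_in: (batch, n_partons, 4) shifted parton sequence (teacher forcing).
        n = partons_in.size(1)
        causal = torch.triu(torch.full((n, n), float("-inf"), device=reco.device), diagonal=1)
        h = self.transformer(self.reco_proj(reco), self.parton_proj(partons_in), tgt_mask=causal)
        return self.out(h)  # predicted parton-level four-momenta

model = RecoToPartonTransformer()
reco = torch.randn(16, 6, 4)       # toy reconstructed objects
partons = torch.randn(16, 4, 4)    # toy parton-level targets
pred = model(reco, partons[:, :-1])
loss = nn.functional.mse_loss(pred, partons[:, 1:])   # placeholder loss for the sketch
loss.backward()
```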
The matrix element method remains a crucial tool for LHC inference in scenarios with limited event data. We enhance our neural network-based framework, now dubbed MEMeNNto, by optimizing phase-space integration techniques and introducing an acceptance function. Additionally, employing new architectures, such as transformer and diffusion models, allows us to better handle complex jet combinatorics...
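As background, the matrix element method weights an observed event x by a phase-space integral of the schematic form P(x|θ) ∝ ∫ dΦ |M(Φ;θ)|² W(x|Φ) ε(Φ). The sketch below is a generic Monte-Carlo estimate of such an integral including an acceptance factor; `matrix_element`, `transfer_function` and `acceptance` are toy stand-ins and not MEMeNNto components.

```python
import numpy as np

rng = np.random.default_rng(0)

def matrix_element(phi, theta):     # |M(Phi; theta)|^2, toy placeholder
    return np.exp(-np.sum((phi - theta) ** 2, axis=-1))

def transfer_function(x, phi):      # W(x | Phi): detector response, toy Gaussian smearing
    return np.exp(-0.5 * np.sum((x - phi) ** 2, axis=-1))

def acceptance(phi):                # eps(Phi): probability the event passes selection
    return 1.0 / (1.0 + np.exp(-phi[..., 0]))

def mem_likelihood(x, theta, n_samples=100_000, dim=3):
    # Importance sampling: draw Phi ~ q = N(x, 1) around the observed event,
    # then reweight by 1/q to estimate the phase-space integral.
    phi = rng.normal(loc=x, scale=1.0, size=(n_samples, dim))
    q = np.exp(-0.5 * np.sum((phi - x) ** 2, axis=-1)) / (2 * np.pi) ** (dim / 2)
    integrand = matrix_element(phi, theta) * transfer_function(x, phi) * acceptance(phi)
    return np.mean(integrand / q)

x_obs = np.array([0.5, -0.2, 1.0])
print(mem_likelihood(x_obs, theta=np.zeros(3)))
```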
Theory predictions for the LHC require precise numerical phase-space integration and generation of unweighted events. We combine machine-learned multi-channel weights with a normalizing flow for importance sampling to improve classical methods for numerical integration. By integrating buffered training for potentially expensive integrands, VEGAS initialization, symmetry-aware channels, and...
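To make the importance-sampling and unweighting steps concrete, the sketch below estimates an integral I = ∫ f(x) dx with samples drawn from a density model and then unweights events by accept/reject against the maximum weight. The Gaussian "flow" is only a stand-in for a trained (multi-channel, VEGAS-initialised) normalizing flow, and the integrand is a toy function rather than a differential cross section.

```python
import math
import torch

class GaussianFlowStandIn:
    """Placeholder for a trained normalizing flow exposing sample() and log_prob()."""
    def sample(self, n):
        return torch.randn(n, 2)
    def log_prob(self, x):
        return -0.5 * (x ** 2).sum(dim=1) - 0.5 * x.size(1) * math.log(2 * math.pi)

def integrand(x):
    # Toy integrand standing in for a phase-space-dependent matrix element weight.
    return torch.exp(-((x - 0.5) ** 2).sum(dim=1))

flow = GaussianFlowStandIn()
x = flow.sample(100_000)
weights = integrand(x) / flow.log_prob(x).exp()     # importance weights f(x) / q(x)
print("integral estimate:", weights.mean().item())  # Monte-Carlo estimate of I

# Unweighting: accept each event with probability w / w_max to obtain unit-weight events.
w_max = weights.max()
accepted = x[torch.rand(len(weights)) < weights / w_max]
print("unweighting efficiency:", len(accepted) / len(x))
```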