1–4 Nov 2022
Rutgers University
US/Eastern timezone

Point Cloud Generation using Transformer Encoders and Normalising Flows

1 Nov 2022, 16:30
20m
Multipurpose Room (aka Livingston Hall), Livingston Student Center (Rutgers University)

Speaker

Benno Kach (Deutsches Elektronen-Synchrotron (DE))

Description

Machine-learning-based data generation has become a major topic in particle physics, as the current Monte Carlo simulation approach is computationally challenging for future colliders, which will have significantly higher luminosity. Generating sets of particles poses problems similar to those encountered with point clouds, and we propose that a transformer setup is well suited to this task. In this study, a novel refinement model is presented, which uses normalizing flows as a prior and then enhances the generated points in an adversarial setup with two Transformer encoder networks. Different training architectures and procedures were tested and compared on the JetNet datasets.
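The description above outlines the setup at a high level. Below is a minimal PyTorch sketch of how an adversarial refinement stage with two Transformer encoders could be arranged; the module names, feature layout (pt, eta, phi), particle multiplicity, and dimensions are illustrative assumptions rather than the authors' implementation, and the normalizing-flow prior is replaced here by a Gaussian placeholder.

```python
# Illustrative sketch only: a point-cloud refiner (generator role) and a critic
# (discriminator role), both built on torch.nn.TransformerEncoder. The samples
# that would come from a normalizing-flow prior are stood in for by Gaussians.
import torch
import torch.nn as nn

N_FEATURES = 3      # per-particle features, e.g. (pt, eta, phi) -- assumed
N_PARTICLES = 30    # particles per jet, as in the JetNet datasets
D_MODEL = 64        # embedding width -- assumed


def make_encoder(n_layers: int = 3, n_heads: int = 4) -> nn.TransformerEncoder:
    layer = nn.TransformerEncoderLayer(
        d_model=D_MODEL, nhead=n_heads, dim_feedforward=128, batch_first=True
    )
    return nn.TransformerEncoder(layer, num_layers=n_layers)


class Refiner(nn.Module):
    """Maps prior point clouds to refined point clouds."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(N_FEATURES, D_MODEL)
        self.encoder = make_encoder()
        self.project = nn.Linear(D_MODEL, N_FEATURES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N_PARTICLES, N_FEATURES); the residual connection keeps
        # the output close to the prior sample being refined
        return x + self.project(self.encoder(self.embed(x)))


class Critic(nn.Module):
    """Scores whole point clouds as real or generated."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(N_FEATURES, D_MODEL)
        self.encoder = make_encoder()
        self.head = nn.Linear(D_MODEL, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.encoder(self.embed(x)).mean(dim=1)  # permutation-invariant pooling
        return self.head(h)


if __name__ == "__main__":
    refiner, critic = Refiner(), Critic()
    prior = torch.randn(8, N_PARTICLES, N_FEATURES)  # placeholder for flow samples
    fake = refiner(prior)
    print(critic(fake).shape)  # torch.Size([8, 1])
```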

Primary author

Benno Kach (Deutsches Elektronen-Synchrotron (DE))

Co-authors

Dirk Kruecker (Deutsches Elektronen-Synchrotron (DESY)), Isabell Melzer-Pellmann (Deutsches Elektronen-Synchrotron (DE))

Presentation materials