6–8 Jul 2021
Europe/Zurich timezone

Lorentz Group Equivariant Autoencoder

6 Jul 2021, 21:00
20m

Speaker

Zichun Hao (Univ. of California San Diego (US))

Description

Symmetries are ubiquitous and essential in physics, and group theory is the framework used to describe them. The symmetry described by the Lorentz group is essential to the dynamics of all particle physics experiments. A Lorentz-group-equivariant deep neural network framework, called the Lorentz group network (LGN), has been introduced by Bogatskiy et al. and benchmarked on jet classification. The model uses irreducible representations of the Lorentz group to achieve equivariance with respect to Lorentz transformations. However, the architecture has not yet been extended to generative, compression, or anomaly detection tasks. We develop an autoencoder based on the LGN architecture for jet compression and reconstruction, using a complex permutation-invariant loss function. We verify that the model is fully equivariant (within numerical precision) and train it on a dataset of high-momentum jets simulated at the LHC. We analyze the latent space after training and explore how hyperparameter choices, such as the multiplicities of scalars and vectors in the latent space and the number of basis functions for the edge features, influence the model's performance.
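The abstract does not spell out the loss function or the equivariance test, so the following is a minimal, hypothetical sketch of the two ingredients it mentions: a permutation-invariant reconstruction loss between input and reconstructed particle clouds (here a Chamfer-style nearest-neighbor matching, which is one common choice and not necessarily the authors' exact loss), and the shape of a numerical Lorentz-equivariance check. All function and variable names (chamfer_loss, random_boost, etc.) are illustrative, not from the paper.

```python
# Hypothetical sketch: permutation-invariant loss on jet particle clouds and a
# template for a numerical Lorentz-equivariance check. Assumes four-momenta
# ordered as (E, px, py, pz); names are illustrative, not from the LGN code.
import torch


def chamfer_loss(jet_in: torch.Tensor, jet_out: torch.Tensor) -> torch.Tensor:
    """Permutation-invariant loss between two particle clouds.

    jet_in, jet_out: shape (batch, n_particles, 4), per-particle four-momenta.
    """
    # Pairwise squared distances between every input and output particle.
    diff = jet_in.unsqueeze(2) - jet_out.unsqueeze(1)   # (B, N, M, 4)
    dist = (diff ** 2).sum(dim=-1)                      # (B, N, M)
    # Match each input particle to its nearest output particle and vice versa,
    # so relabeling (permuting) the particles leaves the loss unchanged.
    loss_in = dist.min(dim=2).values.mean(dim=1)
    loss_out = dist.min(dim=1).values.mean(dim=1)
    return (loss_in + loss_out).mean()


def random_boost(beta: float = 0.5) -> torch.Tensor:
    """4x4 Lorentz boost along the z-axis with velocity beta (illustrative)."""
    gamma = 1.0 / (1.0 - beta ** 2) ** 0.5
    L = torch.eye(4)
    L[0, 0] = L[3, 3] = gamma
    L[0, 3] = L[3, 0] = -gamma * beta
    return L


if __name__ == "__main__":
    torch.manual_seed(0)
    jets = torch.randn(8, 30, 4)                   # toy batch of 30-particle jets
    recon = jets + 0.01 * torch.randn_like(jets)   # stand-in for a decoder output

    # The loss is unchanged when the reconstructed particles are permuted.
    perm = torch.randperm(30)
    print(chamfer_loss(jets, recon).item(), chamfer_loss(jets, recon[:, perm]).item())

    # Equivariance check template: applying a boost to the inputs and then the
    # model should agree (within numerical precision) with applying the model
    # first and boosting its output, e.g. for a hypothetical `model`:
    # L = random_boost()
    # boosted = torch.einsum("ij,bnj->bni", L, jets)
    # assert torch.allclose(model(boosted),
    #                       torch.einsum("ij,bnj->bni", L, model(jets)), atol=1e-4)
```

The symmetric nearest-neighbor matching is what makes the loss insensitive to the ordering of particles in the cloud, which is the property the abstract requires of its reconstruction objective.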

Affiliation Department of Physics, University of California, San Diego
Academic Rank Undergraduate student, Postdoctoral researcher, Professor, PhD student, PhD student

Primary authors

Zichun Hao (Univ. of California San Diego (US))
Daniel Diaz (Univ. of California San Diego (US))
Javier Mauricio Duarte (Univ. of California San Diego (US))
Raghav Kansal (Univ. of California San Diego (US))
Farouk Mokhtar (Univ. of California San Diego (US))

Presentation materials