Description
The breakthroughs in computer vision and image recognition of the past decade, driven by convolutional neural networks (CNNs), have shown that adapting neural network architectures to the symmetries of a particular machine learning problem leads to models that perform better and are easier to train and interpret. These successes have led to applications in lattice gauge theory, such as detecting phase transitions or improving the performance of Monte Carlo methods. In this talk we present lattice gauge equivariant convolutional neural networks (L-CNNs) [1], a general framework for formulating neural networks that are equivariant under lattice gauge symmetry, and demonstrate that L-CNNs can outperform non-equivariant CNNs in non-linear regression tasks. Moreover, we prove that L-CNNs can generate arbitrarily shaped Wilson loops from just a few gauge-equivariant network layers.
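To make the symmetry in question concrete, here is a minimal NumPy sketch (illustrative only, not code from the talk or from [1]): the links U_mu(x) of a small two-dimensional SU(2) lattice are gauge-transformed as U_mu(x) -> Omega(x) U_mu(x) Omega(x+mu)^†, and the traced plaquette, the smallest Wilson loop, comes out unchanged. All names (random_su2, plaquette_trace, the lattice size L) are hypothetical choices for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_su2():
    """Draw a random SU(2) matrix from a normalized quaternion."""
    a = rng.normal(size=4)
    a /= np.linalg.norm(a)
    return np.array([[a[0] + 1j * a[1],  a[2] + 1j * a[3]],
                     [-a[2] + 1j * a[3], a[0] - 1j * a[1]]])

# Tiny periodic 2D lattice: link variables U[t, x, mu] in SU(2), mu in {0, 1}.
L = 4
U = np.empty((L, L, 2, 2, 2), dtype=complex)
for t in range(L):
    for x in range(L):
        for mu in range(2):
            U[t, x, mu] = random_su2()

def plaquette_trace(U, t, x):
    """Re tr of the 1x1 Wilson loop (plaquette) based at site (t, x)."""
    tp, xp = (t + 1) % L, (x + 1) % L
    P = U[t, x, 0] @ U[tp, x, 1] @ U[t, xp, 0].conj().T @ U[t, x, 1].conj().T
    return np.trace(P).real

# Gauge transformation: U_mu(x) -> Omega(x) U_mu(x) Omega(x + mu)^dagger.
Omega = np.array([[random_su2() for _ in range(L)] for _ in range(L)])
V = np.empty_like(U)
for t in range(L):
    for x in range(L):
        V[t, x, 0] = Omega[t, x] @ U[t, x, 0] @ Omega[(t + 1) % L, x].conj().T
        V[t, x, 1] = Omega[t, x] @ U[t, x, 1] @ Omega[t, (x + 1) % L].conj().T

# The traced Wilson loop is gauge invariant: both values agree.
print(plaquette_trace(U, 0, 0), plaquette_trace(V, 0, 0))
```

Gauge equivariance of an L-CNN layer means its intermediate feature maps transform in exactly this covariant way, so traced outputs of the network remain gauge invariant by construction.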
[1] M. Favoni, A. Ipp, D. I. Müller, D. Schuh, "Lattice gauge equivariant convolutional neural networks", arXiv:2012.12901