Description
Machine learning has become increasingly popular in physics over the last decade. Recently, considerable effort has gone into incorporating global and gauge symmetries into various neural network architectures. In this talk, I will focus on translational symmetry, a key idea behind Convolutional Neural Networks (CNNs). After explaining possible ways to ensure translational equivariance in a CNN, I will present two other architecture types that break translational equivariance at different points in the network. I will show how these architectures perform on supervised machine learning tasks for a complex scalar field on the lattice, with particular attention to how well they generalize to different physical parameters and lattice sizes.
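For concreteness, the sketch below illustrates one standard way to ensure translational equivariance on a periodic lattice: convolutions with circular padding (matching periodic boundary conditions), followed by global average pooling so the output is translation invariant and independent of the lattice size. This is an assumed, illustrative setup, not the speaker's actual model; the class name, layer widths, and input encoding (a complex field as two real channels) are hypothetical choices.

```python
import torch
import torch.nn as nn

class EquivariantCNN(nn.Module):
    """Illustrative translationally equivariant CNN for a periodic 2D lattice."""

    def __init__(self, in_channels: int = 2, hidden: int = 16, out_features: int = 1):
        super().__init__()
        # Convolutions with circular padding commute with lattice translations,
        # since the padding reproduces the periodic boundary conditions.
        self.conv1 = nn.Conv2d(in_channels, hidden, kernel_size=3,
                               padding=1, padding_mode="circular")
        self.conv2 = nn.Conv2d(hidden, hidden, kernel_size=3,
                               padding=1, padding_mode="circular")
        self.act = nn.ReLU()
        # A linear head acting on spatially averaged features; the averaging
        # removes positional dependence and works for any lattice size.
        self.head = nn.Linear(hidden, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.act(self.conv1(x))
        x = self.act(self.conv2(x))
        x = x.mean(dim=(-2, -1))  # global average pool over the lattice
        return self.head(x)

if __name__ == "__main__":
    # Check invariance: shifting the field on the periodic lattice
    # (torch.roll) should leave the prediction unchanged.
    net = EquivariantCNN()
    phi = torch.randn(1, 2, 8, 8)  # complex scalar field as (Re, Im) channels
    shifted = torch.roll(phi, shifts=(3, 5), dims=(-2, -1))
    print(torch.allclose(net(phi), net(shifted), atol=1e-5))  # True
```

Breaking translational equivariance, as discussed in the talk, would amount to changing pieces of such a network, for example replacing the circular padding with zero padding or the global pooling with a flatten-plus-dense head.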