6–10 Nov 2023
DESY
Europe/Zurich timezone

Scalar Field Theories via Neural Networks at Initialization

9 Nov 2023, 17:25
1m
Anywhere

Speaker

Anindita Maiti (Perimeter Institute for Theoretical Physics)

Description

Neural networks (NNs), the backbone of deep learning, define field theories through their output ensembles at initialization. Certain limits of NN architectures give rise to free field theories via the Central Limit Theorem (CLT), while other regimes give rise to weakly coupled and non-perturbative field theories via small and large deviations from the CLT, respectively. I will present a systematic construction of free, weakly interacting, and non-perturbative field theories by tuning different attributes of NN architectures, drawing on methods from statistical physics and a new set of Feynman rules. Some interacting field theories of our choice can be engineered exactly at initialization by parametrically deforming the distributions of stochastic variables in NN architectures. As an example, I will present the construction of $\lambda\phi^4$ scalar field theory via statistical independence breaking of NN parameters in the infinite-width limit.

Author

Anindita Maiti (Perimeter Institute for Theoretical Physics)

Co-authors

Prof. James Halverson (Northeastern University), Keegan Stoner, Matthew Schwartz, Dr. Mehmet Demirtas (Northeastern University)
