25–28 Sept 2023
Imperial College London
Europe/London timezone

Efficient and Robust Jet Tagging at the LHC with Knowledge Distillation

25 Sept 2023, 17:45
5m
Blackett Laboratory, Lecture Theatre 1 (Imperial College London)
Lightning Talk · Contributed Talks

Speaker

Mr Ryan Liu (University of California, Berkeley)

Description

The challenging environment of real-time systems at the Large Hadron Collider (LHC) strictly limits the computational complexity of algorithms that can be deployed. For deep learning models, this implies that only smaller models, with lower capacity and weaker inductive biases, are feasible. To address this issue, we use knowledge distillation to combine the performance of large models with the speed of small ones. In this paper, we present an implementation of knowledge distillation for jet tagging, demonstrating an overall boost in the student models' jet tagging performance. Furthermore, by using a teacher model with a strong inductive bias of Lorentz symmetry, we show that we can induce the same bias in the student model, which leads to better robustness against arbitrary Lorentz boosts.
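The distillation setup described above can be sketched as a standard soft-target loss in the style of Hinton et al.: the student is trained on a blend of hard-label cross-entropy and a KL divergence to the teacher's temperature-softened outputs. The function names, temperature, and weighting below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend of (a) KL divergence between temperature-softened teacher and
    student distributions and (b) hard-label cross-entropy on the student.
    T and alpha are hypothetical hyperparameters for illustration."""
    p_t = softmax(teacher_logits, T)  # soft targets from the teacher
    p_s = softmax(student_logits, T)  # soft student predictions
    # KL(teacher || student), scaled by T^2 as in the original KD recipe
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # ordinary cross-entropy on unsoftened student outputs
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return alpha * (T ** 2) * kl.mean() + (1 - alpha) * ce.mean()
```

A student whose logits track the teacher's incurs a lower loss than one that disagrees, which is the mechanism by which the teacher's inductive bias (here, Lorentz symmetry) is transferred.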

Authors

Abhijith Gandrakota (Fermi National Accelerator Lab. (US))
Jennifer Ngadiuba (FNAL)
Mr Ryan Liu (University of California, Berkeley)

Co-authors

Dr Jean-Roch Vlimant (California Institute of Technology (US))
Prof. Maria Spiropulu (California Institute of Technology)
