Description
Contrastive learning (CL) has emerged as a powerful technique for constructing low-dimensional yet highly expressive representations of complex datasets, most notably images. Augmentation-based CL — a fully self-supervised strategy — has been the dominant paradigm in particle physics applications, encouraging a model to learn useful features from input data by promoting insensitivity to irrelevant features (e.g. rotations). In this talk, we present recent work applying the supervised contrastive learning (SCL) paradigm to learn low-dimensional embeddings of jets. SCL explicitly uses class labels in its training objective, making it a natural choice for particle physics where ML algorithms are typically trained on Monte Carlo simulations with unambiguous labels. We show that SCL learns well-structured embeddings for jets that can be used very effectively for downstream tasks such as anomaly detection or traditional supervised analysis. We also discuss preliminary work towards promoting domain adaptation capabilities in the embedding models, wherein the effects of known discrepancies between simulated training data and real LHC data are mitigated in the learned space.
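The abstract does not spell out the SCL training objective, but the paradigm it refers to is the standard supervised contrastive (SupCon) loss, where each anchor's positives are all other in-batch samples sharing its class label. Below is a minimal NumPy sketch of that loss; the function name, the temperature value, and the toy two-class jet labels are illustrative assumptions, not details from the talk.

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive (SupCon) loss on L2-normalised embeddings.

    For each anchor i, positives are all other samples with the same label;
    the softmax denominator runs over every other sample in the batch.
    """
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                    # pairwise cosine sims / tau
    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)              # drop the i == a terms
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(logits) * not_self
    log_prob = logits - np.log(exp.sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & not_self
    n_pos = pos.sum(axis=1)
    valid = n_pos > 0                              # anchors with >= 1 positive
    loss = -(log_prob * pos).sum(axis=1)[valid] / n_pos[valid]
    return loss.mean()

# Toy batch: two hypothetical jet classes (e.g. signal vs. background)
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 4))
lab = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(supcon_loss(emb, lab))
```

Because the label mask defines the positives, minimising this loss pulls same-class jets together in the embedding space while pushing different-class jets apart, which is what produces the well-structured embeddings used for the downstream tasks described above.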
| Would you like to be considered for an oral presentation? | Yes |
|---|---|