25–29 Aug 2025
Monona Terrace
US/Central timezone

EveNet: Towards a Generalist Event Transformer for Unified Understanding and Generation of Collider Data

28 Aug 2025, 18:50
5m
Room MNQR (Monona Terrace)

Computing AI / ML Poster

Speaker

Qibin Liu (SLAC National Accelerator Laboratory (US))

Description

With increasingly large machine learning (ML) models and vast datasets, foundation models have transformed how we apply ML to real-world problems. Multimodal language models such as ChatGPT and Llama extend to a range of specialized tasks from a single common pre-training stage. Similarly, in high-energy physics (HEP), common analysis tasks face recurring challenges that demand scalable, data-driven solutions. In this talk, we present a foundation model for high-energy physics. Our model is pre-trained on extensive simulated datasets to address tasks shared across analyses, offering a unified starting point for specialized applications. We demonstrate the benefits of using such a pre-trained model to improve search sensitivity, anomaly detection, event reconstruction, feature generation, and beyond. By harnessing the power of pre-trained models, we can push the boundaries of discovery with greater efficiency and insight.
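The pre-train-then-adapt workflow the abstract describes can be sketched as follows: a transformer backbone is pre-trained on large simulated datasets, its weights are frozen, and a small task head is trained on top for a downstream analysis task such as event classification. This is a minimal illustration in PyTorch; the class names (EveNetBackbone, TaskHead), the per-particle input layout, and all hyperparameters are hypothetical assumptions for illustration, not the actual EveNet implementation.

# Hypothetical sketch of the pre-train / fine-tune pattern described above.
# Class names, shapes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class EveNetBackbone(nn.Module):
    """Transformer encoder over per-particle features (assumed input layout)."""
    def __init__(self, n_features=8, d_model=128, n_heads=8, n_layers=6):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, x, padding_mask=None):
        # x: (batch, n_particles, n_features)
        h = self.encoder(self.embed(x), src_key_padding_mask=padding_mask)
        return h.mean(dim=1)  # simple pooled per-event representation

class TaskHead(nn.Module):
    """Small head fine-tuned on top of the frozen pre-trained backbone."""
    def __init__(self, d_model=128, n_classes=2):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 64), nn.ReLU(), nn.Linear(64, n_classes))

    def forward(self, z):
        return self.mlp(z)

backbone = EveNetBackbone()
# backbone.load_state_dict(torch.load("evenet_pretrained.pt"))  # hypothetical checkpoint
for p in backbone.parameters():
    p.requires_grad = False  # freeze the backbone; only the task head is trained

head = TaskHead()
events = torch.randn(32, 20, 8)   # 32 events, 20 particles, 8 features each
logits = head(backbone(events))   # per-event prediction for the downstream task

The same frozen backbone can serve multiple heads (classification, regression, generation), which is the sense in which the pre-trained model acts as a unified starting point.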

Authors

Bai-Hong Zhou (Tsung-Dao Lee Institute (CN))
Ben Nachman (Lawrence Berkeley National Lab. (US))
Haoran Zhao (University of Washington (US))
Qibin Liu (SLAC National Accelerator Laboratory (US))
Shih-Chieh Hsu (University of Washington Seattle (US))
Shu Li (Tsung-Dao Lee Institute, Shanghai Jiao Tong Univ. (CN))
Ting-Hsiang Hsu (National Taiwan University (TW))
Vinicius Massami Mikuni (Lawrence Berkeley National Lab. (US))
Wei-Po Wang (University of Washington (US))
Yuan-Tang Chou (University of Washington (US))
Yue Xu (University of Washington (US))
Yulei Zhang (University of Washington (US))