8โ€“12 Sept 2025
Hamburg, Germany
Europe/Berlin timezone

Efficient Point Transformer for Charged Particle Track Reconstruction

Not scheduled
30m
Hamburg, Germany

Poster
Track 2: Data Analysis - Algorithms and Tools
Poster session with coffee break

Speaker

Yuan-Tang Chou (University of Washington (US))

Description

Charged particle track reconstruction is foundational to collider experiments, yet it is also the most computationally expensive part of particle reconstruction. Recent innovations in track reconstruction with graph neural networks (GNNs) have shown promising capability to cope with the computing challenges posed by the High-Luminosity LHC (HL-LHC) using machine learning. However, GNNs face limitations from irregular computations and random memory access, which slow down their inference. In this talk, we introduce a Locality-Sensitive Hashing-based Efficient Point Transformer (HEPT) with advanced attention mechanisms as a superior alternative with near-linear complexity, achieving millisecond-level latency and low memory consumption. We present a comprehensive evaluation of HEPT's computational efficiency and physics performance compared with other algorithms, such as GNN-based pipelines, highlighting its potential to revolutionize full track reconstruction.
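For intuition, below is a minimal, self-contained sketch (in PyTorch, not the authors' implementation) of the bucketed-attention idea behind LSH-based efficient attention: hit coordinates are hashed with a random projection so that nearby points tend to fall into the same bucket, and softmax attention is evaluated only within each bucket, keeping the cost roughly linear in the number of hits. The function name, the single-projection hash, and the quantile-based bucketing are illustrative assumptions; HEPT's actual OR/AND LSH scheme is more elaborate.

import torch
import torch.nn.functional as F

def lsh_bucket_attention(q, k, v, coords, n_buckets=32, seed=0):
    # q, k, v: (N, d) per-hit queries/keys/values; coords: (N, c) hit coordinates.
    # Hypothetical helper for illustration only.
    g = torch.Generator().manual_seed(seed)
    proj = torch.randn(coords.shape[1], 1, generator=g)
    h = (coords @ proj).squeeze(-1)                          # scalar hash per hit
    edges = torch.quantile(h, torch.linspace(0, 1, n_buckets + 1))[1:-1]
    codes = torch.bucketize(h, edges)                        # bucket id per hit
    out = torch.zeros_like(v)
    for b in codes.unique():
        idx = (codes == b).nonzero(as_tuple=True)[0]
        # Full attention restricted to the (small) bucket.
        attn = F.softmax(q[idx] @ k[idx].T / q.shape[1] ** 0.5, dim=-1)
        out[idx] = attn @ v[idx]
    return out

# Toy usage: 1000 hits with 16-dimensional embeddings and 3D coordinates.
N, d = 1000, 16
q, k, v = (torch.randn(N, d) for _ in range(3))
coords = torch.randn(N, 3)
print(lsh_bucket_attention(q, k, v, coords).shape)           # torch.Size([1000, 16])

Because each bucket holds only a small fraction of the hits, the per-bucket attention matrices stay tiny, which is the source of the near-linear scaling and low memory footprint discussed above.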

Significance

We expand on the ICML 2024 result by improving the attention algorithms and studying the physics tracking performance in comparison with other existing methods.

References

Previous results presented at ICML 2024: https://arxiv.org/abs/2402.12535

Authors

Advaith Anand (University of Washington (US)), Amit Saha (Georgia Institute of Technology), Jack Patrick Rodgers (Purdue University (US)), Miaoyuan Liu (Purdue University (US)), Pan Li, Shih-Chieh Hsu (University of Washington Seattle (US)), Shitij Govil (Georgia Institute of Technology), Siqi Miao (Georgia Institute of Technology), Yuan-Tang Chou (University of Washington (US))
