25–29 May 2026
Chulalongkorn University
Asia/Bangkok timezone

Efficient Graph Segmentation via Global Path Inference and Learned Edge Embeddings for Scalable GNN-based Tracking

25 May 2026, 17:45
18m
Chulalongkorn University

Oral Presentation | Track 3 - Offline data processing

Speaker

Jay Chan (Lawrence Berkeley National Lab. (US))

Description

Graph Neural Networks (GNNs) are a leading approach for particle track reconstruction, typically following a three-step pipeline: graph construction, edge classification, and graph segmentation. In edge-classification pipelines like ACORN, the segmentation step is often a trade-off between the speed of local algorithms (e.g., Junction Removal) and the accuracy of global algorithms (e.g., Walkthrough). While the latter achieves superior physics performance by evaluating global path features, its combinatorial complexity limits scalability.
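To make the "local" end of this trade-off concrete, the sketch below (a toy illustration, not ACORN code) thresholds GNN edge scores and labels track candidates as connected components with a union-find; the function name, cut value, and data are illustrative assumptions.

```python
# Toy sketch of local graph segmentation: keep edges whose classifier
# score passes a cut, then group hits into connected components.
# Each decision uses only one edge's score -- fast, but blind to
# global path structure (hence the accuracy gap discussed above).

def segment(num_hits, scored_edges, cut=0.5):
    """Return a component label per hit from thresholded edges."""
    parent = list(range(num_hits))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for u, v, score in scored_edges:
        if score >= cut:                   # purely local decision
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv            # merge the two candidates

    return [find(i) for i in range(num_hits)]

# Toy event: hits 0-1-2 joined by confident edges, 3-4 by another,
# and a low-score edge 2-3 that the cut rejects.
edges = [(0, 1, 0.9), (1, 2, 0.8), (2, 3, 0.2), (3, 4, 0.95)]
labels = segment(5, edges)
```

A junction (a hit with several surviving edges) would silently merge two tracks here, which is exactly the ambiguity the methods below target.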

In this talk, we introduce two novel methods to bridge this gap. First, we propose “D-WALK” (Dynamic programming-based WALKthrough), an algorithm that enables access to global path features without the exhaustive combinatorial search, significantly reducing execution time. Notably, D-WALK extends the capabilities of traditional walkthroughs by enabling efficient path comparisons in both directions, resolving ambiguities at junctions with both multiple incoming and multiple outgoing edges. Second, we introduce “Junction Resolution,” an extension to local segmentation that resolves graph ambiguities using learned edge embeddings from a modified GNN. We demonstrate that Junction Resolution achieves physics performance comparable to or better than global walkthroughs while maintaining the efficiency of local methods. Using the newly released ColliderML dataset, we show that these methods offer a scalable path forward for GNN-based tracking with substantial gains in both physics accuracy and computational throughput.
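The dynamic-programming idea behind an approach like D-WALK can be sketched as follows: on a directed acyclic graph of scored edges, the best-scoring path into each node is computed in a single topological pass, rather than by enumerating all paths. This is a hedged toy (summed edge scores as the path feature, hypothetical function names); the talk's actual path features and algorithm are not reproduced here.

```python
# Toy sketch: best cumulative path score per node via dynamic
# programming over a DAG in topological order. At a junction with
# several incoming edges, whole incoming *paths* are compared, not
# single edge scores -- the global feature a local method lacks.

from collections import defaultdict

def best_paths(nodes_topo, scored_edges):
    """Best cumulative score and predecessor for each node, one pass."""
    incoming = defaultdict(list)
    for u, v, s in scored_edges:
        incoming[v].append((u, s))

    best = {n: 0.0 for n in nodes_topo}   # empty path scores 0
    pred = {n: None for n in nodes_topo}
    for v in nodes_topo:                  # nodes in topological order
        for u, s in incoming[v]:
            if best[u] + s > best[v]:     # compare full incoming paths
                best[v], pred[v] = best[u] + s, u
    return best, pred

def backtrack(pred, v):
    """Recover the best path ending at v."""
    path = [v]
    while pred[path[-1]] is not None:
        path.append(pred[path[-1]])
    return path[::-1]

# Junction at node 2: two incoming candidates (0->1->2 vs 3->2).
edges = [(0, 1, 0.9), (1, 2, 0.8), (3, 2, 0.6), (2, 4, 0.7)]
best, pred = best_paths([0, 1, 3, 2, 4], edges)
track = backtrack(pred, 4)
```

Each edge is visited once, so the cost is linear in the graph size rather than exponential in path count, which is the scalability gain the abstract attributes to replacing exhaustive walkthrough search.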

Author

Jay Chan (Lawrence Berkeley National Lab. (US))
