The upcoming High-Luminosity phase of the Large Hadron Collider requires significant advances in real-time data processing to handle the increased event rates while maintaining high trigger efficiency. In this work, we explore the acceleration of graph neural networks (GNNs) on field-programmable gate arrays (FPGAs) for fast inference in future muon trigger pipelines, which demand O(100) ns latencies. Graph-based architectures offer a natural way to represent and process detector hits while preserving spatial and topological information, making them particularly well suited to muon reconstruction in a noisy, sparse environment. This work contributes to the broader goal of integrating AI-driven solutions into high-energy physics trigger systems and represents a step towards hardware-optimised, graph-based inference for real-time event selection in experimental physics.
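To make the graph-based representation concrete, the sketch below shows one possible way detector hits could be encoded as a graph and passed through a single message-passing step. This is an illustrative assumption, not the authors' model or their FPGA workflow: the hit features (x, y, z, time), the radius cut, the layer sizes, and the helper names `build_graph` and `EdgeConvBlock` are all hypothetical choices made here for clarity.

```python
# Illustrative sketch only: encoding muon-detector hits as a graph and applying
# one message-passing step. Feature choice, radius cut, and layer sizes are
# assumptions, not the actual trigger model.
import torch
import torch.nn as nn

class EdgeConvBlock(nn.Module):
    """One message-passing step: each hit aggregates information from its
    spatial neighbours, preserving local detector topology."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        # Small MLP keeps the parameter count modest (FPGA-friendly).
        self.mlp = nn.Sequential(nn.Linear(2 * in_dim, out_dim), nn.ReLU())

    def forward(self, x, edge_index):
        src, dst = edge_index                      # edges point src -> dst
        msg = self.mlp(torch.cat([x[dst], x[src] - x[dst]], dim=1))
        out = torch.zeros(x.size(0), msg.size(1))
        out.index_add_(0, dst, msg)                # sum messages arriving at each hit
        return out

def build_graph(hits, radius=0.1):
    """Connect hits that are spatially close; the first three columns of
    `hits` are assumed to be (x, y, z) coordinates."""
    d = torch.cdist(hits[:, :3], hits[:, :3])
    src, dst = torch.nonzero((d < radius) & (d > 0), as_tuple=True)
    return torch.stack([src, dst])

# Toy event: 50 hits with (x, y, z, time) features.
hits = torch.rand(50, 4)
edge_index = build_graph(hits)
node_emb = EdgeConvBlock(in_dim=4, out_dim=8)(hits, edge_index)
event_summary = node_emb.mean(dim=0)               # event-level quantity for a trigger decision
```

In practice, a model of this kind would be quantised and compressed before synthesis to fixed-latency FPGA logic to meet the O(100) ns budget; those steps are outside the scope of this sketch.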