The CERN Quantum Technology Initiative (QTI) was launched in 2020 with the aim of investigating the role that quantum technologies could play within the High Energy Physics (HEP) research program. During this initial exploratory phase a set of results was gathered, outlining the benefits, constraints and limitations of introducing these technologies in different HEP domains, from advanced sensors for...
Reconstructing the trajectories of charged particles as they traverse several detector layers is a key ingredient for event reconstruction at the LHC and virtually any particle physics experiment. The limited bandwidth available, together with the high rate of tracks per second, $O(10^{10})$ - where each track consists of a variable number of measurements - makes this problem exceptionally...
Graph Neural Networks (GNNs) have emerged as powerful tools for particle tracking in High Energy Physics (HEP), effectively modeling the complex relational structure of detector hits. Recent progress in quantum computing raises the possibility that quantum circuits, leveraging entanglement and superposition, could enhance GNNs by capturing intricate patterns in tracking data. However, the...
Quantum computing is arguably the only viable approach to simulating dynamical phenomena of general quantum systems. Inspired by the rapid advancement of quantum computing capabilities in recent years, there is a highly active community of theoretical and experimental high-energy physicists who seek to understand which HEP observables can be calculated through quantum simulations and how such...
Quantum computing technologies are emerging as promising tools to enhance computational speed and reduce resource demands in high-energy accelerator experiments. In particular, GPU-based simulations inspired by quantum annealing principles have reached a practical level of development. They are now being explored for their potential in track reconstruction tasks.
In...
We propose a quantum protocol for efficiently learning and sampling multivariate probability distributions that commonly appear in high-energy physics. Our approach introduces a bivariate probabilistic model based on generalized Chebyshev polynomials, which is (pre-)trained as an explicit circuit-based model for two correlated variables, and sampled efficiently with the use of quantum...
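The bivariate Chebyshev construction can be illustrated classically: the model density is the square of a truncated two-variable Chebyshev expansion, which guarantees nonnegativity. In the sketch below the coefficient matrix is hypothetical (in the actual protocol it would come from circuit training), and the sampling step via quantum hardware is omitted:

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval2d

# Hypothetical coefficient matrix c[i, j] multiplying T_i(x) * T_j(y);
# in the actual protocol these would be trained circuit parameters.
c = np.array([[1.0, 0.3],
              [0.3, 0.5]])

# Grid on [-1, 1]^2, the natural domain of the Chebyshev polynomials.
xs = np.linspace(-1.0, 1.0, 201)
ys = np.linspace(-1.0, 1.0, 201)
dx, dy = xs[1] - xs[0], ys[1] - ys[0]
X, Y = np.meshgrid(xs, ys, indexing="ij")

# Born-rule-style density: square the expansion so the model is nonnegative,
# then normalize numerically so it integrates to one.
amp = chebval2d(X, Y, c)
p = amp ** 2
p /= p.sum() * dx * dy
```

The off-diagonal coefficient is what introduces correlation between the two variables; setting it to zero factorizes the density.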
Building on the success of TrackingBERT (arXiv:2402.10239) and TrackingSorter (arXiv:2407.21290), we propose a unified approach to address the track finding problem and extend beyond it. Our method integrates the latent representation of detector hits learned from TrackingBERT as additional features in the TrackingSorter algorithm. Furthermore, we replace the detector module ID-based tokenization...
Hybrid pixel detectors such as Timepix3 and Timepix4 enable pixel-level resolution of individual particle interactions, where each event manifests as a cluster or track spanning multiple pixels. Analyzing these clusters allows for the estimation of key particle parameters, including type, initial energy, and angle of incidence. However, such ground-truth parameters are typically unavailable...
The High Luminosity Large Hadron Collider (HL-LHC), scheduled to begin operation after 2030, will increase the number of proton-proton collisions per event from approximately 60 to up to 200. This rise in interaction density will substantially elevate the occupancy within the ATLAS Muon Spectrometer, necessitating more efficient and robust real-time data processing strategies for the Event...
The implementation of neural networks and artificial intelligence on hardware accelerators is a key element for the development of the trigger strategy of future high-energy physics detectors. In parallel, the development of hardware technologies capable of maximizing AI performance and adapting to its computational needs is crucial, particularly in scenarios requiring the efficient processing...
Simulating physics processes and detector responses is essential in high energy physics but accounts for significant computing costs. Generative machine learning has been demonstrated to be potentially powerful in accelerating simulations, outperforming traditional fast simulation methods. While efforts have focused primarily on calorimeters, initial studies have also been performed on silicon...
Electron-positron Higgs factories, such as the ILC and FCC-ee, are the next-generation collider projects to reveal the properties of the Higgs boson and other particles in great detail and to observe BSM effects. The proposed detectors are designed to maximize the information obtained from incident particles with highly granular detector elements. DNN-based reconstruction algorithms are critical to derive properties of...
Fast machine learning (ML) inference is of great interest in the HEP community, especially in low-latency environments like triggering. Faster inference often unlocks the use of more complex ML models that improve physics performance, while also enhancing productivity and sustainability. Logic gate networks (LGNs) currently achieve some of the fastest inference times for standard image...
The G-200 pipeline, as it is known internally in ATLAS, represents a major milestone in the evolution of online track reconstruction for the experiment, enabling the entire reconstruction chain at trigger level to be executed on GPU architectures. G-200 is the ATLAS ITk-specific implementation of the Traccc framework, which is part of the ACTS GPU R&D project and is designed to be detector agnostic. Traccc...
The LHCb experiment at the Large Hadron Collider (LHC) operates a fully software-based trigger system that processes proton-proton collisions at a rate of 30 MHz, reconstructing both charged and neutral particles in real time. The first stage of this trigger system, running on approximately 500 GPU cards, performs a track pattern recognition to reconstruct particle trajectories with low...
As part of the ATLAS Phase-II upgrade, the Event Filter (EF) Tracking project is exploring heterogeneous computing systems. This EF system will receive events selected by the L0 trigger and process the data from the ATLAS detector at a maximum rate of 1 MHz. FPGA-based implementations are being developed for various stages of track reconstruction algorithms such as pixel clustering, strip...
In high-energy physics experiment trigger systems, track segment seeding is a resource-intensive function. The primary reason lies in the high computational complexity of the segment-finding process: O(n³) in software implementations using nested loops, and O(n) × O(N²) in typical FPGA implementations, where n is the number of hits per detector layer in an event, and N is the number of bins...
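The O(n³) behaviour of nested-loop segment finding can be made concrete with a short sketch; the three-layer geometry, hit counts, and collinearity tolerance below are hypothetical toy values:

```python
import random

random.seed(0)
n = 20  # hits per layer (hypothetical toy value)

# Toy detector: three layers, each with n hits at random transverse positions.
layers = [[random.uniform(-10.0, 10.0) for _ in range(n)] for _ in range(3)]

def find_seeds(layers, tol=0.5):
    """Nested-loop triplet seeding: keep triplets whose middle hit lies close
    to the straight line through the outer two. Three nested loops over the
    hits of each layer make this O(n^3)."""
    seeds = []
    for x0 in layers[0]:
        for x1 in layers[1]:
            for x2 in layers[2]:
                if abs(x1 - 0.5 * (x0 + x2)) < tol:
                    seeds.append((x0, x1, x2))
    return seeds

seeds = find_seeds(layers)
n_combinations_examined = n ** 3  # 8000 triplets checked for n = 20
```

Doubling the hits per layer multiplies the examined combinations by eight, which is why this step dominates trigger-level resource budgets.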
After an intensive R&D programme, LHCb approved the construction of the Downstream Tracker (DWT): a system based on the "artificial retina" to pre-reconstruct tracks in the SciFi (the detector downstream of the magnet) at readout level during Run 4. Running before any trigger level, it has to process events at the average LHC crossing rate of 30 MHz.
The "artificial retina" is an extremely...
The future development projects for the Large Hadron Collider towards the HL-LHC will bring steady increases in nominal luminosity, with the ultimate goal of reaching a peak luminosity of $5 \cdot 10^{34}\,\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$ for the ATLAS and CMS experiments. This rise in luminosity will directly result in an increased number of simultaneous proton collisions (pileup), up to 200, that will pose new...
The Phase-2 Upgrade for the CMS experiment at the HL-LHC brings with it new and improved detectors, including a new tracker. It will provide the means to reconstruct higher-quality tracks in the pixel detector, especially in the endcaps (pseudorapidity |η| ≳ 2), by allowing hits to be recorded in many more layers than is currently possible. The Phase-2 CMS tracking at the HLT currently plans...
The High-Luminosity LHC (HL-LHC) era presents new challenges for the CMS detector. To address them, the endcap calorimeters will be replaced with a High-Granularity Calorimeter (HGCAL) that provides exceptional spatial resolution and precise timing. A new reconstruction framework, The Iterative CLustering (TICL), is being developed in CMS Software (CMSSW) to exploit HGCAL’s features and...
The High-Luminosity Upgrade for the LHC is rapidly approaching and the CMS experiment is undergoing fundamental changes to take advantage of the new physics possibilities offered by the collider’s upgrade. In this context, the event reconstruction, both offline and at the High-Level Trigger (HLT), is undergoing significant changes to fully exploit the detectors’ upgrades while aiming to...
The PV-finder algorithm employs a hybrid deep neural network to reconstruct primary vertex positions (PVs) in proton-proton collisions at the LHC. The algorithm was originally developed for use in LHCb, but it has been adapted successfully for use in the much higher pile-up environment of ATLAS. PV-finder integrates fully connected layers that do track-by-track calculations with a...
The success of neural network based tracking algorithms for high energy colliders has prompted us to explore the merits of these methods for tracking in the lower energy regime of the PANDA experiment. In this talk, I will present the current state of a tracking pipeline that has been adapted from the Exa.TrkX group and that has an interaction graph neural network at its core. It has an...
The upcoming High Luminosity phase of the Large Hadron Collider requires significant advancements in real-time data processing to handle the increased event rates and maintain high-efficiency trigger decisions. In this work, we explore the acceleration of graph neural networks on field-programmable gate arrays for fast inference within future muon trigger pipelines with O(100) ns latencies....
Over the last years, a general purpose track finding algorithm based on the combinatorial Kalman filter (CKF) has been developed for the Acts toolkit - a community-driven project that provides experiment-independent tracking algorithms written in modern C++. It has been validated and optimized with the OpenDataDetector (ODD), and the ATLAS Phase-2 Inner Tracker (ITk). The CKF shows good...
Efficient and accurate charged particle tracking is one of the most computationally demanding challenges at the High-Luminosity Large Hadron Collider (HL-LHC). Graph neural networks (GNNs) have emerged as one of the more promising solutions, as they exploit correlations between nearby hits and tracks rather than treating each track independently. However, tracking GNNs can be slow to train and...
The HL-LHC upgrade of the ATLAS inner detector (ITk) brings an unprecedented challenge, both in terms of the large number of silicon cluster readouts and the throughput required for budget-constrained track reconstruction. Applying Graph Neural Networks (GNNs) has been shown to be a promising solution to this problem with competitive physics performance at sub-second inference time.
In...
Geometric learning pipelines have achieved state-of-the-art performance in High-Energy and Nuclear Physics reconstruction tasks like flavor tagging and particle tracking [1]. Starting from a point cloud of detector or particle-level measurements, a graph can be built where the measurements are nodes, and where the edges represent all possible physics relationships between the nodes. Depending...
Charged particle track reconstruction is the foundation of high-energy experiments and the backbone of many downstream reconstruction algorithms. Yet it is also the most computationally expensive part of particle reconstruction. The innovation in tracking reconstruction with graph neural networks (GNNs) has shown promising capability to cope with the computing challenges posed by the...
Reconstructing particle trajectories is a significant challenge in most particle physics experiments and a major consumer of CPU resources. It can typically be divided into three steps: seeding, track finding, and track fitting. Seeding involves identifying potential trajectory candidates, while track finding entails associating detected hits with the corresponding particle. Finally, track...
Highly boosted jets represent some of the most challenging tracking conditions in modern collider experiments. At higher momentum, particles become increasingly collimated in the jet core, which results in greater track density and a large degree of multiple particles sharing a single cluster. As a consequence, traditional tracking approaches suffer in terms of either decreased...
Accurate and efficient particle tracking is a crucial component of precise measurements of the Standard Model and searches for new physics. This task consists of two main computational steps: track finding, the identification of a subset of all hits that are due to a single particle; and track fitting, the extraction of crucial parameters such as direction and momenta. Novel solutions to...
Standard tracking pipelines are only capable of finding particles with helical trajectories, yet many theories of new physics predict particles with non-helical trajectories. Graph neural network based trackers have recently been shown to be able to find non-helical tracks when trained on specific examples, such as quirks. But unanticipated new physics may feature unexpected trajectories,...
Graph construction is an essential step in Graph Neural Network (GNN) based tracking pipelines. The goal of graph construction is to build a graph that contains only true edge connections between nodes (detector spacepoints). A promising approach to graph construction is metric learning, where a node embedding space is learned and nodes are connected according to...
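A minimal sketch of the metric-learning idea, with the trained embedding replaced by a hand-built stand-in: spacepoints from the same particle land close together in the embedding space, and a radius graph then recovers mostly true edges. All centers, spreads, and radii below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

n_tracks, hits_per_track = 5, 4
# Stand-in for a learned embedding: hits from the same particle cluster
# around well-separated per-track centers (not a trained model).
centers = np.array([[10.0 * i, 0.0, 0.0] for i in range(n_tracks)])
emb = np.concatenate([c + 0.1 * rng.uniform(-1.0, 1.0, size=(hits_per_track, 3))
                      for c in centers])
track_id = np.repeat(np.arange(n_tracks), hits_per_track)

# Radius graph in embedding space: connect spacepoint pairs closer than r_max.
r_max = 1.0
d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
src, dst = np.nonzero((d < r_max) & (d > 0.0))

# Edge purity: fraction of edges linking hits from the same particle.
purity = float(np.mean(track_id[src] == track_id[dst]))
```

With well-separated clusters the radius graph is pure; in a real detector the embedding overlap between nearby tracks is what drives the purity/efficiency trade-off of the radius cut.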
The new fully software-based trigger of the LHCb experiment operates at a 30 MHz data rate and imposes tight constraints on GPU execution time. Tracking reconstruction algorithms in this first-level trigger must efficiently select detector hits, group them, build tracklets, account for the LHCb magnetic field, extrapolate and fit trajectories, and select the best track candidates to filter...
The Circular Electron Positron Collider (CEPC) is a proposed collider designed to investigate Higgs boson properties with precision and explore new physics. Its Technical Design Report (TDR) is currently under development, focusing on a reference detector. The tracking detector system of this reference detector includes a vertex detector (VTX), an inner silicon tracker (ITK), a time projection...
The ALPHA collaboration at CERN operates two machines dedicated to testing fundamental symmetries using trapped antihydrogen atoms. The ALPHA-2 experiment was built in 2012 and is optimized to perform high-precision spectroscopy on antihydrogen [1]. The ALPHA-g apparatus, completed in 2021, is designed to measure its gravitational acceleration [2]. In both instances, the physics signal is...
The RadMap Telescope is a compact radiation monitor that can characterize the radiation environment aboard spacecraft and determine the biologically relevant dose received by astronauts. Its main sensor is a tracking calorimeter made from 1024 scintillating-plastic fibers of alternating orientation read out by silicon photomultipliers. It allows the three-dimensional tracking and...
Many beyond the Standard Model (BSM) theories, such as hidden sector models, heavy neutral leptons, and neutral naturalness frameworks, predict invisible long-lived particles that travel macroscopic distances before decaying into visible Standard Model particles with a common spatial origin. Fast and accurate reconstruction of secondary vertices therefore plays a central role in ATLAS LLP...
For Run 3 of the LHC, the LHCb experiment has introduced, among other improvements, two novel reconstruction methods. One targets long-lived particles: it uses the new fully software-based trigger operating at a 30 MHz data rate, opening a search window into previously unexplored regions of physics phase space. The BuSca (Buffer Scanner) method acquires and analyzes data in real time, extending...
The T2K experiment is a long-baseline neutrino oscillation experiment, primarily aiming to search for CP violation in the neutrino sector from the precision measurement of neutrino and antineutrino oscillations. The neutrino beam is generated at J-PARC and is measured at Super-Kamiokande, and it is also measured at near detectors to reduce systematic uncertainties. Currently, the cross-section...
In the MEG II experiment, which searches for $\mu\to e\gamma$, a cylindrical drift chamber measures positrons from muon decays. A key challenge arises from the declining positron reconstruction efficiency in the high-pileup environment, primarily due to algorithm limitations. To address this, a machine learning-based noise filtering technique has been developed. This presentation introduces...
The COMET experiment at J-PARC aims to search for the charged lepton flavor violating process of muon-to-electron conversion with unprecedented sensitivity. One of the most serious backgrounds originates from cosmic-ray muons. In particular, a track produced by a backward-going positive muon can mimic the 105 MeV/c signal electron in a cylindrical drift chamber. To address this, we developed a...
Modern beam telescopes play a crucial role in high-energy physics experiments to precisely track particle interactions. Accurate alignment of detector elements in real-time is essential to maintain the integrity of reconstructed particle trajectories, especially in high-rate environments like the ATLAS experiment at the Large Hadron Collider (LHC). Any misalignment in the detector geometry can...
We present the first release of a large-scale, fully simulated benchmark dataset using the OpenDataDetector (ODD) under high-luminosity collider conditions (aka ColliderML). The ODD integrates several advanced next-generation detector technologies to realistically capture the complexity of collisions expected at future collider experiments, notably at the High-Luminosity Large Hadron Collider...
The LHCb collaboration is currently using a pioneering data-filtering system in its trigger, based on real-time particle reconstruction using Graphics Processing Units (GPUs). This corresponds to processing 40 Tbit/s of data and has required a huge amount of hardware and software development. Among these developments, power consumption and sustainability are an imperative matter in...
Four-dimensional trackers are devices capable of simultaneously measuring spatial and temporal coordinates with extremely good resolution (O(10 μm) and O(10 ps)) and represent a promising avenue for charged-particle tracking. The ACTS library has the capability to include time information in track and vertexing reconstruction algorithms thanks to its 6-parameter track representation. In this...
Particle track reconstruction is one of the most important and challenging tasks to be performed in the high luminosity phase of LHC experiments. Extensive research is being done to develop reconstruction methods that provide the same efficiency as the current adaptive filter methods but with enough throughput to suit the increase in the event information density of this new environment. A...
At future colliders such as the High-Luminosity LHC (HL-LHC), the average number of simultaneous pp interactions per event, or pile-up (µ), will rise from the current 30-60 to as much as 200. Within the full event simulation and reconstruction chain, reconstruction of charged particles quickly becomes the most computationally intensive step because it scales combinatorially with an increasing...
Reconstruction of charged particle trajectories is a difficult task in high-energy physics, especially under harsh conditions. Using the RANSAC algorithm, we developed a track finding algorithm for low-momentum electrons with multiple-turn trajectories, which can work in a drift chamber with only stereo layers and without seed candidates. The performance is tested with simulation samples and...
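The RANSAC principle can be sketched on a much simpler toy than multiple-turn drift chamber trajectories: a straight track buried in random noise hits. The geometry, noise levels, and iteration count below are hypothetical illustration values:

```python
import random

random.seed(42)

# Toy event: 20 hits on the straight track y = 2x + 1 with small measurement
# noise, buried in 40 random noise hits (all values hypothetical).
track_hits = [(float(x), 2.0 * x + 1.0 + random.gauss(0.0, 0.02)) for x in range(20)]
noise_hits = [(random.uniform(0.0, 19.0), random.uniform(-5.0, 45.0)) for _ in range(40)]
hits = track_hits + noise_hits

def ransac_line(hits, n_iter=200, tol=0.3):
    """Minimal RANSAC: sample hit pairs, fit the line through them, and keep
    the hypothesis with the largest inlier set."""
    best_inliers = []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = random.sample(hits, 2)
        if x1 == x2:
            continue  # skip degenerate (vertical) hypotheses
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = [(x, y) for (x, y) in hits if abs(y - (m * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

inliers = ransac_line(hits)
```

Because RANSAC only needs a minimal sample to propose a hypothesis, it works without seed candidates; for helical or multiple-turn tracks the two-point line fit would be replaced by a minimal circle or helix fit.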
The tracker passive material cannot be neglected within the charged particle track reconstruction algorithm. In particular, track propagation must account for material effects such as multiple scattering and energy loss. A simplistic treatment leads to biases in the reconstructed track parameters and potential reconstruction efficiency losses.
In the CMS Offline Software (CMSSW), the...
Track-based alignment is essential for determining the precise geometry of particle detectors by minimizing the residuals between measured hits and reconstructed particle trajectories. Currently, the most widely used alignment algorithm is based on the global $\chi^2$ method. This method solves a system of linear equations derived from the first-order expansion of the alignment problem,...
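The linear system at the heart of the global $\chi^2$ method can be sketched for a toy case with one alignment parameter per plane; the shifts, hit counts, and noise level below are hypothetical, and real alignment Jacobians couple many more parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 4 detector planes, each with an unknown shift along x
# (values are hypothetical).
true_shifts = np.array([0.10, -0.05, 0.02, 0.08])
n_hits, sigma = 2000, 0.01
plane = rng.integers(0, 4, size=n_hits)  # plane each hit was measured on

# For straight reference tracks, the residual on plane m is the plane's
# shift plus measurement noise, so d(residual)/d(shift_m) = 1.
res = true_shifts[plane] + rng.normal(0.0, sigma, size=n_hits)

# Global chi^2: minimize sum_i (res_i - J_i . delta)^2 by solving the
# normal equations (J^T J) delta = J^T res.
J = np.zeros((n_hits, 4))
J[np.arange(n_hits), plane] = 1.0
delta = np.linalg.solve(J.T @ J, J.T @ res)
```

In this diagonal toy case the solution reduces to the mean residual per plane; the power of the global method is that the same normal equations remain solvable when track and alignment parameters are correlated.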
An overview of the Monte Carlo (MC) simulation process at the four main experiments - ATLAS, CMS, ALICE, and LHCb - at the Large Hadron Collider (LHC) is provided in this talk. Given the complexity and noise of proton-proton collisions, simulated samples are essential for understanding and interpreting the experimental results, enabling precise measurements and searches for new physics. The...
In general, tracking requires significant computational resources, and the ATLAS experiment currently reconstructs only particles with transverse momentum above 0.5 GeV. However, in searches for new physics, such as long-lived charginos, analyzing lower momentum regions is expected to offer a novel approach. In this study, we formulate tracking as a combinatorial optimization problem and apply...
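One common way to cast tracking as a combinatorial optimization problem is a QUBO over candidate track segments; the sketch below uses a hypothetical three-candidate instance (the quality scores, conflicts, and penalty weight are illustrative, not the formulation of this study) and solves it by brute force where an annealer would minimize the same energy:

```python
import itertools
import numpy as np

# Toy QUBO for track candidate selection: one binary variable per candidate
# triplet. Linear terms reward high-quality candidates; quadratic penalties
# forbid selecting two candidates that share a hit. All values are hypothetical.
quality = np.array([0.9, 0.8, 0.3])   # candidate quality scores
conflicts = [(0, 2), (1, 2)]          # candidate 2 shares hits with 0 and 1
penalty = 2.0

def qubo_energy(x):
    e = -float(np.dot(quality, x))    # maximizing quality = minimizing -quality
    for i, j in conflicts:
        e += penalty * x[i] * x[j]    # shared-hit penalty
    return e

# Brute force over all 2^3 assignments; an annealer samples the same landscape.
best = min(itertools.product([0, 1], repeat=3), key=qubo_energy)
# best == (1, 1, 0): keep the two good, non-conflicting candidates
```

The penalty must exceed the quality gain of any conflicting candidate, otherwise the minimum-energy state can include two candidates that share a hit.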
The CMS detector will undergo several key upgrades for its Phase-2 in the upcoming High-Luminosity Large Hadron Collider (HL-LHC), including the replacement of the entire silicon tracking system, which will consist of an inner tracker based on silicon pixel modules and an outer tracker made from silicon modules with strip and macro-pixel sensors. In addition to a new detector, reconstruction...
The global $\chi^2$ fitter (GX2F) is a least-squares fitter designed to determine the initial parameters of a particle track. We present a thorough mathematical derivation and implement a modern version within the widely used tracking framework ACTS (A Common Tracking Software). The GX2F provides an efficient method for accounting for material effects by fitting all scattering angles, which...
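The least-squares core of a global fitter can be sketched on a straight-line toy track; the normal equations below are standard weighted least squares and omit the scattering-angle parameters that GX2F additionally fits (all numerical values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy straight-line track in the (z, x) plane: x = x0 + t * z, measured on
# 8 surfaces with Gaussian noise (all values hypothetical).
x0_true, t_true, sigma = 1.0, 0.5, 0.02
z = np.linspace(0.0, 10.0, 8)
x_meas = x0_true + t_true * z + rng.normal(0.0, sigma, size=z.size)

# chi^2 = (m - A p)^T W (m - A p) is minimized by the normal equations
# (A^T W A) p = A^T W m; the inverse of A^T W A is the parameter covariance.
A = np.column_stack([np.ones_like(z), z])  # derivatives d(x)/d(x0), d(x)/d(t)
W = np.eye(z.size) / sigma**2
cov = np.linalg.inv(A.T @ W @ A)
p = cov @ (A.T @ W @ x_meas)
chi2 = float((x_meas - A @ p) @ W @ (x_meas - A @ p))
```

Extending the parameter vector with one scattering angle per material surface keeps the same normal-equation structure, which is what makes the global approach a natural way to account for material effects.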
The drift chamber is used in the BESIII experiment to provide both momentum measurement and particle identification with dE/dx for charged particles. It has just undergone an upgrade of the inner chamber to solve an aging problem. A cosmic ray test was performed at the beginning of this year for detector commissioning and software alignment. A track reconstruction algorithm...
As the High-Luminosity LHC (HL-LHC) era approaches, significant improvements in reconstruction software are required to keep pace with the increased data rates and detector complexity. Some promising algorithms for high-throughput event reconstruction are GPU-based algorithms. Among those algorithms, the Kalman Filter is commonly used for the estimation of track parameters. While it leads to a...
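The Kalman filter's predict-update cycle can be sketched in its simplest one-dimensional form, estimating a constant track parameter from noisy per-layer measurements; the transport model is the identity and all numbers are hypothetical toy values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1D Kalman filter: estimate a constant track parameter from noisy
# measurements on 50 successive layers (identity transport, toy values).
x_true, sigma_meas = 2.0, 0.5
meas = x_true + rng.normal(0.0, sigma_meas, size=50)

x_est, P = 0.0, 100.0  # initial estimate and a deliberately large covariance
for m in meas:
    K = P / (P + sigma_meas**2)        # gain: weight given to the new measurement
    x_est = x_est + K * (m - x_est)    # update the estimate toward the measurement
    P = (1.0 - K) * P                  # shrink the covariance after each update
```

Each hit refines the estimate sequentially, which is the property that makes the filter attractive for parallel, per-track GPU execution, but also the source of the serial data dependencies that such implementations must work around.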
Reconstructing charged particle trajectories is a fundamental challenge in high-energy physics, particularly with the increased particle multiplicities anticipated at the High-Luminosity Large Hadron Collider (HL-LHC). Traditional tracking algorithms may struggle with the computational demands posed by these conditions. In our study, we introduce an innovative approach that leverages recent...