Martin Beyer · 25/05/2026, 13:45 · Track 3 - Offline data processing · Oral Presentation
The Compressed Baryonic Matter experiment (CBM) at FAIR is designed to explore the QCD phase diagram at high baryon densities with interaction rates up to 10 MHz using triggerless free-streaming data acquisition. The CBM Ring Imaging Cherenkov detector (RICH) contributes to the overall PID by identification of electrons from the lowest momenta up to 6-8 GeV/c, with a pion suppression factor of...

Wahid Redjeb (CERN) · 25/05/2026, 14:03 · Track 3 - Offline data processing · Oral Presentation
The increase in luminosity and pileup at the High-Luminosity LHC (HL-LHC) will place unprecedented demands on the CMS experiment, requiring major advances in both detector technology and event reconstruction. Among the planned upgrades, the High-Granularity Calorimeter (HGCAL) will replace the current endcap calorimeters, providing fine spatial segmentation and precision timing. These features...

Jose Daniel Gaytan Villarreal (Carnegie-Mellon University (US)) · 25/05/2026, 14:21 · Track 3 - Offline data processing · Oral Presentation
We present the first application of a one-pass, machine learning based imaging calorimeter reconstruction approach to the latest full CMS High Granularity Calorimeter (HGCAL) simulation. The model is a Graph Neural Network that directly processes the hits in the HGCAL, one of the most important upgrades of the Compact Muon Solenoid detector in preparation for the High-Luminosity phase of the...

Dr Guang Zhao (Institute of High Energy Physics (CAS)) · 25/05/2026, 14:39 · Track 3 - Offline data processing · Oral Presentation
Particle identification (PID) is essential for future particle physics experiments such as the Circular Electron-Positron Collider and the Future Circular Collider. A high-granularity Time Projection Chamber (TPC) not only provides precise tracking but also enables dN/dx measurements for PID. The dN/dx method estimates the number of primary ionization electrons, offering significant...
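
As an illustrative aside on why cluster counting helps (a toy sketch, not the authors' method, and the cluster densities below are made-up numbers): primary ionization cluster counts are approximately Poisson-distributed, so the relative fluctuation of a dN/dx estimate shrinks with track length, yielding strong hadron separation.

```python
import numpy as np

# Toy dN/dx separation study. Cluster densities are illustrative
# placeholders, NOT real TPC values.
rng = np.random.default_rng(seed=1)
n_pi, n_k = 25.0, 22.0   # assumed primary clusters/cm for pions and kaons
track_len = 100.0        # cm of sampled track

# Poisson cluster counting along the track, converted back to clusters/cm.
dndx_pi = rng.poisson(n_pi * track_len, size=20_000) / track_len
dndx_k = rng.poisson(n_k * track_len, size=20_000) / track_len

# Gaussian-equivalent separation power between the two hypotheses;
# roughly 6 sigma for these made-up densities and track length.
sep = abs(dndx_pi.mean() - dndx_k.mean()) / np.sqrt(
    0.5 * (dndx_pi.var() + dndx_k.var()))
```

Because the variance of a Poisson count equals its mean, the separation grows like the square root of the number of sampled clusters, which is the statistical advantage dN/dx has over truncated-mean dE/dx with its long Landau tails.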

Matthieu Martin Melennec (Centre National de la Recherche Scientifique (FR)) · 25/05/2026, 14:57 · Track 3 - Offline data processing · Oral Presentation
One of the major difficulties of particle reconstruction in calorimeters is the case of overlapping objects in the detector. This problem will become particularly concerning at the High-Luminosity LHC, where the increased luminosity will cause high levels of pile-up. High-granularity calorimeters, such as the future HGCal in the CMS endcap, allow us to perform Particle Flow (PF) reconstruction...

Aashay Arora (Univ. of California San Diego (US)) · 25/05/2026, 16:15 · Track 3 - Offline data processing · Oral Presentation
High-pileup conditions in CMS during the HL-LHC era make charged-particle tracking increasingly challenging as detector occupancy and combinatorics grow. We present a hybrid approach that exploits Line Segment Tracking (LST) objects rather than individual hits to enable the first CMS ML-based track reconstruction algorithm. The LST segments are built according to geometry- and physics-driven...

Carlo Varni (AGH University of Krakow (PL)), Krzysztof Cieśla (AGH University of Krakow (PL)), Marcin Wolter (Polish Academy of Sciences (PL)), Tomasz Bold (AGH University of Krakow (PL)) · 25/05/2026, 16:33 · Track 3 - Offline data processing · Oral Presentation
Reconstructing charged-particle tracks in silicon detectors is one of the most computationally demanding tasks in high-energy physics. When applied in online event selection systems, additional latency constraints make the problem even more challenging. Within the reconstruction chain, the efficient and high-purity formation of track candidates plays a critical role in the overall...

Marilena Bandieramonte (University of Pittsburgh (US)) · 25/05/2026, 16:51 · Track 3 - Offline data processing · Oral Presentation
In response to the rising computational and storage demands of the High-Luminosity Large Hadron Collider (HL-LHC), efforts are underway to boost the processing efficiency of ATLAS Inner Detector (ID) event reconstruction. Our strategy to reduce the computational demands employs a Track-Overlay approach, which uses pre-reconstructed pile-up tracks (from separate minimum-bias simulations) and...

Yunhe Yang (Nankai University), Xinyu Zhuang · 25/05/2026, 17:09 · Track 3 - Offline data processing · Oral Presentation
We present an end-to-end track reconstruction algorithm based on Graph Neural Networks (GNNs) for a 35-layer multilayer drift chamber (MDC) combined with a 3-layer cylindrical gas electron multiplier (CGEM) in the BESIII experiment at the BEPCII collider. The algorithm directly processes MDC wire measurements and CGEM clusters as input to simultaneously predict the number of track candidates...

ๅ ่ฝฒ ๅผ (Institute of High Energy Physics, Chinese Academy of Sciences) · 25/05/2026, 17:27 · Track 3 - Offline data processing · Oral Presentation
The COMET experiment is designed to search for charged lepton flavor violation (CLFV) through coherent muon-to-electron conversion, characterized by a 105 MeV electron signal. In Phase I, an all-stereo-layer Cylindrical Drift Chamber (CDC) is used as the main tracker for charged-particle measurement. A key challenge is that all the signal tracks are curled and about one-third of the tracks in...

Jay Chan (Lawrence Berkeley National Lab. (US)) · 25/05/2026, 17:45 · Track 3 - Offline data processing · Oral Presentation
Graph Neural Networks (GNNs) are a leading approach for particle track reconstruction, typically following a three-step pipeline: graph construction, edge classification, and graph segmentation. In edge-classification pipelines like ACORN, the segmentation step is often a trade-off between the speed of local algorithms (e.g., Junction Removal) and the accuracy of global algorithms (e.g.,...
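
For readers unfamiliar with the segmentation step, a minimal sketch of the common connected-components formulation (an illustration of the general technique, not ACORN's implementation): after edge classification, hits linked by edges whose score passes a cut are grouped into track candidates with union-find.

```python
def segment_tracks(num_hits, edges, scores, cut=0.5):
    """Group hits into track candidates: keep edges with score > cut,
    then label connected components with union-find. Illustrative
    sketch only, not the ACORN code."""
    parent = list(range(num_hits))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for (a, b), s in zip(edges, scores):
        if s > cut:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb  # union the two components

    return [find(i) for i in range(num_hits)]

# Two 3-hit tracks; the low-score edge (2, 3) is a fake cross-link
# that the cut removes before component labelling.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
scores = [0.9, 0.8, 0.1, 0.95, 0.7]
labels = segment_tracks(6, edges, scores)
```

Union-find makes this a fast local algorithm; the accuracy gap versus global methods arises when a single mis-classified edge merges or splits components.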

Jiarui Hu (IHEP) · 26/05/2026, 13:45 · Track 3 - Offline data processing · Oral Presentation
X-ray phase contrast imaging based on propagation is a crucial technique for achieving non-destructive detection at micro and nano scales. However, the recovery of phase information from intensity measurements presents a typical ill-posed inverse problem. Traditional iterative algorithms often necessitate multiple distance measurements, which increases both the complexity and time cost of...

David Schultz (University of Wisconsin-Madison) · 26/05/2026, 13:45 · Track 3 - Offline data processing · Oral Presentation
Re-processing data with improved detector understanding, new data processing methods, etc. is natural for any particle physics experiment over the course of its life. The IceCube Neutrino Observatory previously re-processed its data nearly a decade ago. Now we are processing the data for the third time, which we call Pass3. With this reprocessing, we have recorded three times as much...

Prof. Daniel Nieto (IPARCOS-UCM) · 26/05/2026, 14:03 · Track 3 - Offline data processing · Oral Presentation
The Cherenkov Telescope Array Observatory (CTAO) represents the next generation of ground-based gamma-ray telescopes, designed to probe the very-high-energy (VHE) sky above 20 GeV with unprecedented sensitivity. With the first Large-Sized Telescope (LST-1) prototype already taking data on La Palma, robust software is required to accurately reconstruct the properties of primary particles (type,...

Axel Naumann (CERN) · 26/05/2026, 14:03 · Track 3 - Offline data processing · Oral Presentation
High Energy Physics uses C++ for performance-critical, large-scale (50 million lines of code) libraries. Python is used for analysis. C++ is complex and getting more so, with industry creating a very competitive market for developers. Python is very slow but very common. Is there any way out? As part of the R&D done in the Next Generation Triggers project we are looking at novel languages that...

Max Hart (University College London (GB)) · 26/05/2026, 14:21 · Track 3 - Offline data processing · Oral Presentation
Modern collider detector experiments comprise multiple different detector subsystems, each of which requires dedicated reconstruction algorithms. Manually tuning these algorithms so that they work optimally not only in isolation, but also when combined to form a full reconstruction chain, is a time-consuming task that poses technical and organisational challenges. We demonstrate...

Lucas Astrand · 26/05/2026, 14:21 · Track 3 - Offline data processing · Oral Presentation
Machine-learning techniques are becoming an increasingly important part of the design and physics reach of the proposed HIBEAM/NNBAR program at the European Spallation Source. Building on our previously published ML studies for particle identification and event reconstruction, we are developing a broader suite of ML tools to support detector optimization, vertex and event reconstruction, and...

Richa Sharma (University of Puerto Rico (US)) · 26/05/2026, 14:39 · Track 3 - Offline data processing · Oral Presentation
The CMS Pixel Detector in Run 3, with about 1400 silicon modules, is a central part of the Tracker, providing precise tracking and vertex reconstruction. Ensuring high quality data requires continuous monitoring, as modules can degrade or suffer operational issues. Traditionally, experts relied on a GUI that displayed histograms integrated over entire runs, making it difficult to spot...

Sanjeeda Bharati Das (Torino University and INFN) · 26/05/2026, 14:39 · Track 3 - Offline data processing · Oral Presentation
MANTRA (Measuring Anti-Neutron: Tagging and Reconstruction Algorithm for frontier experiments) is a PRIN 2022 Italian project that proposes a new method to measure the energy of anti-neutrons produced in high-energy physics experiments. Anti-neutrons cannot be reconstructed by the tracking systems; however, they can produce so-called annihilation stars in electromagnetic calorimeters,...

Shuang Wang (IHEP) · 26/05/2026, 14:57 · Track 3 - Offline data processing · Oral Presentation
Astronomical satellites serve as critical infrastructure in the field of astrophysics, and data processing is one of the most essential processes for conducting scientific research on cosmic evolution, celestial activities, and dark matter. Recent advancements in satellite sensor resolution and sensitivity have led to petabyte (PB)-scale data volumes, characterized by unprecedented scale and...

Amy Byrnes · 26/05/2026, 16:15 · Track 3 - Offline data processing · Oral Presentation
The High-Luminosity Large Hadron Collider (HL-LHC) is expected to produce data at the exabyte scale, motivating the exploration of new methods for reducing data volumes. Error-bounded lossy compression has been adopted in many scientific domains as an effective strategy for reducing storage and I/O costs without compromising the quality of downstream analyses.
However, selecting an...
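
A minimal sketch of what "error-bounded" means in this context (uniform scalar quantization, the building block of compressors in this family; not the specific method of this contribution): every reconstructed value is guaranteed to lie within a user-chosen absolute error bound of the original.

```python
import numpy as np

def quantize(data, abs_err):
    """Map each value to an integer bin of width 2*abs_err; the integer
    codes are what a lossless entropy coder (not shown) would compress."""
    return np.round(data / (2.0 * abs_err)).astype(np.int64)

def dequantize(codes, abs_err):
    """Reconstruct bin centers; the error is bounded by abs_err
    by construction of the bin width."""
    return codes * (2.0 * abs_err)

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
bound = 1e-3  # user-chosen absolute error bound (illustrative)
x_rec = dequantize(quantize(x, bound), bound)
max_err = np.abs(x - x_rec).max()  # never exceeds `bound`
```

The downstream-analysis question raised above is then which bound is loose enough to compress well yet tight enough to leave physics distributions unchanged.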

Prabhat Solanki (Universita & INFN Pisa (IT)) · 26/05/2026, 16:15 · Track 3 - Offline data processing · Oral Presentation
The upgrade of the CMS apparatus for the HL-LHC will provide unprecedented timing measurement capabilities, in particular for charged particles through the MIP Timing Detector (MTD). One of the main goals of this upgrade is to compensate for the deterioration of primary vertex reconstruction induced by the increased pileup of proton-proton collisions by separating clusters of tracks not only in...

Florine Willemijn de Geus (CERN/University of Twente (NL)) · 26/05/2026, 16:33 · Track 3 - Offline data processing · Oral Presentation
With the data deluge expected from the High-Luminosity LHC and limited storage resources, the need to reduce the on-disk file size of High-Energy Physics (HEP) data becomes even more pressing. Lossless compression algorithms and encodings are already extensively used across all experiments' data tiers, often leading to significant reductions of the total on-disk data volume for...

Ching-Hua Li (Aix Marseille Univ, CNRS/IN2P3, CPPM, Marseille, France) · 26/05/2026, 16:33 · Track 3 - Offline data processing · Oral Presentation
To achieve higher physics precision, the LHCb experiment is operating at an increased instantaneous luminosity in Run 3, leading to an unprecedented challenge in total data volume. A single proton-proton collision generates hundreds of tracks, yet the target signals involve only a few; this imbalance severely inflates the event data size. To efficiently reduce the event size while retaining...

Felice Pantaleo (CERN) · 26/05/2026, 16:51 · Track 3 - Offline data processing · Oral Presentation
The High-Luminosity LHC will vastly increase both the volume and complexity of data to be processed within the CMS software framework (CMSSW), pushing computational throughput to its limits. Efficient use of accelerator hardware, especially GPUs, will be central to sustaining reconstruction and analysis performance under these conditions. Among the most impactful design choices for...

Lukasz Graczykowski (Warsaw University of Technology (PL)) · 26/05/2026, 16:51 · Track 3 - Offline data processing · Oral Presentation
Identifying the products of ultrarelativistic collisions delivered by the LHC and RHIC colliders is one of the crucial objectives of experiments such as ALICE and STAR, which are specifically designed for this task. They allow for precise Particle Identification (PID) over a broad momentum range.
Traditionally, PID methods rely on hand-crafted selections, which compare the recorded signal of...

Valerii Kholoimov (EPFL - Ecole Polytechnique Federale Lausanne (CH)) · 26/05/2026, 17:09 · Track 3 - Offline data processing · Oral Presentation
Long-lived particles (LLPs) are present in many Standard Model extensions and could provide solutions to long-standing problems in modern physics. In this work, machine-learning based techniques are developed to probe for the presence of such particles, specifically Heavy Neutral Leptons (HNLs) and Axion-Like Particles (ALPs), decaying in the LHCb muon detector. Their decays will produce...

Marcin Nowak (Brookhaven National Laboratory (US)) · 26/05/2026, 17:09 · Track 3 - Offline data processing · Oral Presentation
The ATLAS experiment has surpassed 1 exabyte of stored data, much of it managed through the Athena POOL Replacement (APR) persistency framework. Derived from the original LCG POOL project, APR has long provided a technology-independent abstraction layer that enabled seamless support for multiple backends, including ROOT TTree, TKey, and more recently RNTuple. While APR has proven remarkably...

Tomas Raila (Vilnius University (LT)) · 26/05/2026, 17:27 · Track 3 - Offline data processing · Oral Presentation
The High-Luminosity upgrade of the LHC (HL-LHC) will present an unprecedented computational challenge for the CMS experiment, with the average number of simultaneous proton-proton interactions (pileup) expected to reach 200 per bunch crossing. Accurately modeling this background environment requires the production of massive, high-fidelity simulated event datasets. Currently, CMS employs a...

Aurora Perego (Universita & INFN, Milano-Bicocca (IT)) · 26/05/2026, 17:45 · Track 3 - Offline data processing · Oral Presentation
The extreme pileup conditions expected at the High-Luminosity LHC (HL-LHC) require new technologies to cope with the higher occupancy. One of the strategies adopted to address this challenge is the use of precise timing information in event reconstruction. The CMS experiment will introduce two new sub-detectors with timing capabilities: the MIP Timing Detector (MTD) covering both barrel and...

David Rohr (CERN) · 26/05/2026, 17:45 · Track 3 - Offline data processing · Oral Presentation
ALICE is the dedicated heavy-ion experiment at the LHC at CERN, recording lead-lead collisions at interaction rates of up to 50 kHz. ALICE was the first LHC experiment to leverage GPUs for online data processing in LHC Runs 1 and 2, and its Run 3 online data processing scheme today is fully based on GPUs, with more than 90% of the compute load offloaded to the accelerator. In order to...

Dr Oliver Gregor Rietmann (CERN) · 27/05/2026, 13:45 · Track 3 - Offline data processing · Oral Presentation
In high-performance computing, we strive for algorithms on large arrays to be as performant as possible. However, the performance of such an algorithm is also affected by the memory layout of these arrays. The most natural memory layout is Array-of-Structures (AoS), which performs well for strided access patterns and for large classes. On the other hand, Structure-of-Arrays (SoA) allows for...
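
The trade-off can be illustrated with NumPy as a stand-in for the C++ layouts discussed (a sketch for intuition only): a structured array is AoS, a dense array per field is SoA, and a vectorized sweep over a single field strides over unused bytes in the AoS case.

```python
import numpy as np

n = 100_000

# AoS: one 24-byte record per element; x, y, z of an element sit together.
aos = np.zeros(n, dtype=[("x", "f8"), ("y", "f8"), ("z", "f8")])

# SoA: one dense, contiguous array per field.
soa = {f: np.zeros(n) for f in ("x", "y", "z")}

vals = np.arange(n, dtype=np.float64)
aos["x"] = vals
soa["x"][:] = vals

# A field view of the AoS array is strided (one value every 24 bytes),
# while the SoA column is contiguous (one value every 8 bytes), so a
# sweep over a single field reads 3x less memory in the SoA layout.
assert aos.dtype.itemsize == 24
assert aos["x"].strides == (24,)
assert soa["x"].strides == (8,)
```

Which layout wins therefore depends on the access pattern: whole-record access favors AoS, per-field vectorized access favors SoA.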

Izaac Sanderswood (Univ. of Valencia and CSIC (ES)) · 27/05/2026, 14:03 · Track 3 - Offline data processing · Oral Presentation
Precise reconstruction of particle decay chains is an essential tool for a wide range of analyses in particle physics experiments, particularly those focused on flavour dynamics and CP violation. We present a novel decay tree reconstruction framework designed to handle complex topologies with deeply constrained particle decays, trajectory extrapolations over long distances inside regions with...

Noemi Calace (CERN) · 27/05/2026, 14:21 · Track 3 - Offline data processing · Oral Presentation
The ATLAS experiment is undertaking a major modernisation of its reconstruction software to meet the demanding conditions of High-Luminosity LHC (HL-LHC) operations. A key element of this effort is the use of the experiment-independent ACTS toolkit for track reconstruction, which requires a major redesign of several parts of the current ATLAS software. This contribution will describe the ACTS...

Jan Stark (Laboratoire des 2 Infinis - Toulouse, CNRS / Univ. Paul Sabatier (FR)) · 27/05/2026, 14:39 · Track 3 - Offline data processing · Oral Presentation
The High-Luminosity LHC (HL-LHC) will bring large increases in collision rate and pile-up. This represents a significant surge in both data quantity and complexity. In addition to excellent physics performance, a high computational efficiency is critical to fully exploit the HL-LHC datasets. In response, substantial R&D efforts in machine learning (ML) have been initiated by the ATLAS...

Sam Young · 27/05/2026, 14:57 · Track 3 - Offline data processing · Oral Presentation
Liquid argon time projection chambers (LArTPCs) provide dense, high-fidelity 3D measurements of particle interactions and underpin many current and future neutrino and rare-event experiments. Event reconstruction typically relies on complex detector-specific pipelines that use tens of hand-engineered pattern recognition algorithms or cascades of task-specific neural networks that require...

Prof. Ziyan Deng · 27/05/2026, 16:15 · Track 3 - Offline data processing · Oral Presentation
The BESIII experiment, which studies tau-charm physics at the BEPCII accelerator, has been operating since 2009; both the BEPCII accelerator and the BESIII detector have been upgraded over these years. The BESIII offline software system, developed on the Gaudi framework, provides the fundamental basis for physics analysis. This talk focuses on the...

Tao Lin (Chinese Academy of Sciences (CN)) · 27/05/2026, 16:33 · Track 3 - Offline data processing · Oral Presentation
The Jiangmen Underground Neutrino Observatory (JUNO) is a multipurpose neutrino experiment designed to determine the neutrino mass ordering and to achieve high-precision measurements of neutrino oscillation parameters. Construction of the JUNO detector was completed at the end of 2024, followed by commissioning of the water phase and the subsequent liquid scintillator filling phase. Physics...

Andrew Paul Olivier (Argonne National Laboratory) · 27/05/2026, 16:51 · Track 3 - Offline data processing · Oral Presentation
The Deep Underground Neutrino Experiment (DUNE) will deploy four 10 kt fiducial mass liquid argon-based tracking calorimeters to study neutrino oscillation properties, supernova neutrinos, and beyond the standard model physics. To accomplish its diverse physics program, DUNE must read out over 1000 time-samples of waveforms for each of its nearly 400,000 channels. Therefore, a DUNE data...

Philippe Canal (Fermi National Accelerator Lab. (US)) · 27/05/2026, 17:09 · Track 3 - Offline data processing · Oral Presentation
Over many years, ROOT users have repeatedly stumbled over (and loudly rediscovered) the infamous 1 GB limit on individual I/O operations, a constraint that somehow survived long past the era when anyone thought a gigabyte was "a lot." As experiments embraced ever-larger objects and collections, this limit became an increasingly unavoidable rite of passage. This contribution recounts the...

Danilo Piparo (CERN) · 27/05/2026, 17:27 · Track 3 - Offline data processing · Oral Presentation
In this contribution we discuss the status of the ROOT project right before the LHC Long Shutdown 3. We highlight the usage of ROOT by non-LHC communities, for example gravitational-wave physics, nuclear physics, and neutrino physics, as well as experiments at electron colliders. In addition, the usage of ROOT in contexts such as market regulation will be discussed. The processes by which the...

Juan Miguel Carceller (CERN) · 27/05/2026, 17:45 · Track 3 - Offline data processing · Oral Presentation
In this contribution, we highlight several recent developments within Key4hep, the turnkey software stack for future collider studies. These developments cover a variety of topics, most importantly a first stable release of the common event data model format, EDM4hep, and related developments. We have also significantly enhanced the integration with external software packages such as ACTS for...

Fabrice Le Goff (University of Oregon (US)) · 28/05/2026, 13:45 · Track 3 - Offline data processing · Oral Presentation
Over the last ten years, the detector-agnostic, open-source track reconstruction toolkit ACTS has matured to production-level quality; it is used in offline data processing in ATLAS, sPHENIX, and FASER, and is part of many upgrade and feasibility studies within the community at large. For ATLAS, the ACTS-based track reconstruction has surpassed the legacy setup for the predicted Phase-2 performance in...

Anna Zaborowska (CERN) · 28/05/2026, 14:03 · Track 3 - Offline data processing · Oral Presentation
We present the first full release of ColliderML, a large-scale, fully simulated benchmark dataset for algorithm R&D as well as machine-learning applications. It is built on top of the OpenDataDetector (ODD) under high-luminosity collider conditions. ODD comprises a set of subsystems that are representative of future collider experiments like at the...

Felix Schlepper (CERN, Heidelberg University (DE)) · 28/05/2026, 14:21 · Track 3 - Offline data processing · Oral Presentation
In ALICE, LHC Run 3 marks a major step toward GPU-centric data processing.
During the synchronous (online) phase, GPUs are fully dedicated to Time Projection Chamber reconstruction and compression. During the asynchronous (offline) phase, additional reconstruction tasks can be offloaded to GPUs to improve overall computing efficiency and throughput. We report the porting of the ITS2...

Giacomo De Pietro (Karlsruhe Institute of Technology) · 28/05/2026, 14:39 · Track 3 - Offline data processing · Oral Presentation
High levels of beam-induced detector noise and detector aging degrade track-finding performance in the Belle II central drift chamber, resulting in losses of both track finding efficiency and purity. This motivates the development of reconstruction approaches capable of maintaining robust performance under deteriorating detector conditions. Building on our earlier work on an end-to-end...

Adriano Di Florio (CC-IN2P3) · 28/05/2026, 14:57 · Track 3 - Offline data processing · Oral Presentation
The upcoming upgrades to the Large Hadron Collider for the HL-LHC era will progressively increase the nominal luminosity, aiming to reach a peak value of $5\times10^{34}$ cm$^{-2}$ s$^{-1}$ for the ATLAS and CMS experiments. Higher luminosity will naturally lead to a larger number of proton-proton interactions occurring in the same bunch crossing, with pileup levels that may reach up to 200,...
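
The quoted pileup can be cross-checked with the standard relation mu = L·sigma_inel / (n_b·f_rev); the cross-section and bunch count below are typical assumed values, and it is the ultimate HL-LHC luminosity scenario (about 7.5×10^34) that pushes mu toward 200.

```python
# Back-of-envelope pileup estimate. sigma_inel and n_b are assumed
# typical values, not official HL-LHC machine parameters.
sigma_inel = 80e-27   # cm^2 (~80 mb inelastic pp cross-section, assumed)
f_rev = 11245.0       # LHC revolution frequency in Hz
n_b = 2760            # colliding bunches, assumed filling scheme

def pileup(lumi):
    """Average pp interactions per bunch crossing at instantaneous
    luminosity `lumi` (cm^-2 s^-1)."""
    return lumi * sigma_inel / (n_b * f_rev)

mu_nominal = pileup(5.0e34)    # roughly 130 for these assumptions
mu_ultimate = pileup(7.5e34)   # roughly 190 for these assumptions
```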

Sanjiban Sengupta (CERN, University of Manchester) · 28/05/2026, 16:15 · Track 3 - Offline data processing · Oral Presentation
Deploying machine learning models in environments with high-throughput, low-latency, and strict memory constraints is challenging, especially when these environments evolve rapidly and require simplified user-control, dependency management, and long-term maintainability. In high-energy physics, and particularly within the Trigger Systems of major LHC experiments, similar requirements arise for...

Vakho Tsulaia (Lawrence Berkeley National Lab. (US)) · 28/05/2026, 16:33 · Track 3 - Offline data processing · Oral Presentation
To address this challenge and prepare for the transition to large, resource-intensive ML models, we propose leveraging AthenaTriton for DAOD production, where these ML models are executed on dedicated computing resources. AthenaTriton is a tool for running ML inference as a service in Athena using the NVIDIA Triton server software. We discuss different deployment strategies for Triton servers...

Jay Chan (Lawrence Berkeley National Lab. (US)) · 28/05/2026, 16:51 · Track 3 - Offline data processing · Oral Presentation
The High-Luminosity LHC (HL-LHC) will impose unprecedented pile-up and throughput demands on the ATLAS offline tracking reconstruction, making computational efficiency an essential requirement alongside physics performance. We present a comprehensive study of the ATLAS GNN4ITk offline track-reconstruction pipeline, spanning graph construction, Graph Neural Network (GNN) inference, and track...

Nathan Jihoon Kang (Argonne National Laboratory (US)) · 28/05/2026, 17:09 · Track 3 - Offline data processing · Oral Presentation
Efficient and maintainable in-file metadata is crucial for large-scale event processing. The ATLAS experiment's Athena event-processing framework relies on complex navigational and metadata infrastructure to manage event processing across diverse workflows. As experimental demands grow, inefficiencies and redundancies in the current metadata infrastructure have constrained storage efficiency,...

Dr Alexey Boldyrev · 28/05/2026, 17:27 · Track 3 - Offline data processing · Oral Presentation
The reliability and reproducibility of machine learning models are critically important for their use in automated systems. In the field of HEP, this may include detector optimization, use in blind analysis, and situations where estimates of model uncertainties are required. Building upon our previous research on developing robust model selection algorithms, we propose and comprehensively test...

Maksym Naumchyk · 28/05/2026, 17:45 · Track 3 - Offline data processing · Oral Presentation
Awkward Array is a widely used library in high-energy physics (HEP) for representing and manipulating nested, variable-length data in Python. Previous CHEP contributions have explored GPU acceleration for Awkward Array, demonstrating the feasibility and performance benefits of a CUDA-based backend while also identifying limitations related to irregular data access, fine-grained kernel launches,...