At the High Luminosity Large Hadron Collider (HL-LHC), up to 200 proton-proton collisions happen during a single bunch crossing. This leads on average to tens of thousands of particles emerging from the interaction region. The CPU time of traditional approaches to constructing hit combinations will grow exponentially as the number of simultaneous collisions increases at the HL-LHC, posing a...
Scintillating-fibre detectors are high-efficiency, fast-readout tracking devices employed throughout high-energy particle physics, for instance the SciFi tracker in the LHCb upgrade. The hybrid seeding is a stand-alone track reconstruction algorithm for the SciFi. The algorithm is designed iteratively: tracks with higher momentum, which are easier to reconstruct, are treated first. With...
The physics program of the LHCb experiment depends on an efficient and precise reconstruction of the primary vertices produced by proton-proton collisions. The LHCb Upgrade detector, starting to take data in 2021 with a fully software-based trigger, requires an online reconstruction at a rate of 30 MHz, necessitating fast vertex finding algorithms. We present a new approach to...
The first LHCb upgrade will take data at an instantaneous luminosity of $2\times10^{33}~\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$ starting in 2021. Due to the high rate of beauty and charm signals, LHCb has chosen as its baseline to read out the entire detector into a software trigger running on commodity x86 hardware at the LHC collision frequency of 30 MHz, where a full offline-quality reconstruction will be performed. In this talk...
The High Luminosity Large Hadron Collider is expected to have a readout rate ten times higher than at present, significantly increasing the required computational load. It is therefore essential to explore new hardware paradigms. In this work we consider the Optical Processing Units (OPU) from [LightOn][1], which compute random matrix multiplications on large datasets in an analog, fast and...
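As a rough numerical stand-in for the optical primitive (my sketch, not LightOn's actual API), the transform these devices are commonly described as computing is y = |Rx|^2 with a fixed random matrix R, which can be simulated as follows:

    import numpy as np

    rng = np.random.default_rng(42)

    def opu_features(X, n_features=2000):
        # Stand-in for the optical transform: project the input through a
        # fixed complex Gaussian random matrix R and take the squared
        # modulus, y = |R x|^2.  On the device, the multiplication happens
        # in analog optics at essentially constant time in n_features.
        d = X.shape[1]
        R = rng.normal(size=(d, n_features)) + 1j * rng.normal(size=(d, n_features))
        return np.abs(X @ R) ** 2

    # e.g. random features of binarised detector images, fed to a linear model
    X = (rng.random((64, 1024)) > 0.5).astype(float)
    Y = opu_features(X)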
An unprecedented increase in the complexity and scale of data is expected in the computation required for the tracking detectors of the High Luminosity Large Hadron Collider (HL-LHC) experiments. While the currently used Kalman-filter-based algorithms are reaching their limits in terms of ambiguities from the increasing number of simultaneous collisions, occupancy, and scalability (worse than quadratic), a...
In the transition to Run 3 in 2021, LHCb will undergo a major luminosity upgrade, going from 1.1 to 5.6 expected visible Primary Vertices (PVs) per event, and it will adopt a purely software trigger. We present an improved hybrid algorithm for vertexing in the upgrade conditions. We use a custom kernel to transform the sparse 3D space of hits and tracks into a dense 1D dataset, and then apply...
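As a minimal sketch of the kernel idea (illustrative only; the experiment's actual kernel and all names here are assumptions), each track can contribute a Gaussian centred on its point of closest approach to the beamline, so that summing over tracks yields a dense 1D profile whose peaks mark PV candidates:

    import numpy as np

    def beamline_kde(z_poca, sigma_z, z_grid):
        # Each track adds a Gaussian centred at its z at the point of
        # closest approach (POCA) to the beamline, weighted by its
        # measurement uncertainty; peaks of the sum indicate vertices.
        kernels = np.exp(-0.5 * ((z_grid[None, :] - z_poca[:, None])
                                 / sigma_z[:, None]) ** 2) / sigma_z[:, None]
        return kernels.sum(axis=0)

    # toy event with three vertices at z = -40, 5 and 62 mm
    rng = np.random.default_rng(0)
    z_true = np.repeat([-40.0, 5.0, 62.0], [20, 35, 15])
    z_poca = z_true + rng.normal(0.0, 0.3, z_true.size)
    sigma = np.full(z_true.size, 0.3)
    grid = np.linspace(-120, 120, 2400)          # 0.1 mm bins
    profile = beamline_kde(z_poca, sigma, grid)  # PV candidates = peaks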
The upgraded LHCb detector will begin taking data in 2021 with a triggerless readout system. As a result the full 30 MHz inelastic collision rate will be processed using a software-only High Level Trigger (HLT). This will allow for dramatic improvements in LHCb's ability to study beauty and charm hadron decays, but also presents an extraordinary technical challenge and has prompted the study...
The Mu2e experiment at Fermilab searches for the charged-lepton flavor violating conversion of a negative $\mu$ into an $e^-$ in the field of an Al nucleus. The Mu2e goal is to improve by four orders of magnitude the current best limit on the search sensitivity. The main detector consists of a 3.2 m long straw-tube tracker and a crystal calorimeter housed in a 1 T superconducting...
Clustering of charged particle tracks along the beam axis is the first step in reconstructing the positions of hadronic interactions, also known as primary vertices, at hadron collider experiments. We demonstrate the use of a 2036 qubit D-Wave quantum annealer to perform track clustering in a limited capacity on artificial events where the positions of primary vertices and tracks resemble...
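One standard way to cast such clustering for an annealer is as a QUBO; the sketch below is my own formulation (not necessarily the paper's): binary variable (i, k) is 1 when track i is assigned to vertex k, pairs of distant tracks are penalised for sharing a vertex, and a one-hot penalty keeps each track in exactly one cluster:

    from itertools import combinations

    def clustering_qubo(z, n_vertices, penalty=10.0):
        # Returns {(var_a, var_b): coeff} with variables labelled
        # (track, cluster), a format accepted by typical QUBO samplers.
        Q = {}
        n = len(z)
        # compactness: cost |z_i - z_j| when two tracks share a cluster
        for i, j in combinations(range(n), 2):
            for k in range(n_vertices):
                Q[((i, k), (j, k))] = abs(z[i] - z[j])
        # one-hot constraint per track: penalty * (sum_k x_ik - 1)^2
        for i in range(n):
            for k in range(n_vertices):
                Q[((i, k), (i, k))] = -penalty
            for k, l in combinations(range(n_vertices), 2):
                Q[((i, k), (i, l))] = 2.0 * penalty
        return Q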
The electronics of the CMS (Compact Muon Solenoid) DT (Drift Tubes) chambers will need to be replaced for the HL-LHC (High Luminosity Large Hadron Collider) operation due to the increase of occupancy and trigger rates in the detector, which cannot be sustained by the present system. A system is being designed that will asynchronously forward the totality of the chamber signals to the control...
High event rates of up to 20 MHz and continuous detector readout make event filtering at PANDA a challenging task. In addition, no hardware-based event selection will be possible due to the similarities between signal and background. PANDA is among a new generation of experiments utilizing fully software-based event filtering. Currently, detector hits are often pre-sorted into events by...
In the most recent year of data-taking with the ATLAS detector at the Large Hadron Collider (LHC), the minimum pT of reconstructed tracks was 500 MeV. This threshold was set to reduce the combinatorics of track finding and to save disk space, both of which are challenges in high-pileup environments. However, most proton-proton collisions at the LHC will result in a large number of soft...
The high instantaneous luminosity conditions in the High Luminosity Large Hadron Collider (HL-LHC) pose major computational challenges for the collider experiments. One of the most computationally challenging components is the reconstruction of charged-particle tracks. In order to efficiently operate under these conditions, it is crucial that we explore new and faster methods or...
The success of the CMS physics program at the HL-LHC requires maintaining sufficiently low trigger thresholds to select processes at the electroweak scale. With an average of 200 expected pileup interactions, the inclusion of tracking in the L1 trigger is critical to achieving this goal while maintaining manageable trigger rates. A 40 MHz silicon-based track trigger on the scale of the CMS...
A highly interesting, but difficult to trigger on, signature for Beyond Standard Model searches is massive long-lived particles decaying inside the detector volume. Current detectors and detection methods are optimised for detecting prompt decays and rely on indirect, additional energetic signatures for the online selection of displaced events during data-taking. Improving the trigger-level detection...
During the High-Luminosity Phase 2 of LHC, scheduled to start in 2026, the ATLAS detector is expected to collect more than 3 ab$^{-1}$ of data at an instantaneous luminosity reaching up to $7.5\times10^{34}~\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$, corresponding to about 200 inelastic proton-proton collisions per bunch crossing. In order to cope with the large radiation doses and to maintain the physics...
The tracking system of Belle II consists of a silicon vertex detector (VXD) and a cylindrical drift chamber (CDC), both operating in a magnetic field created by the main solenoid of 1.5 T and final focusing magnets. The tracking algorithms employed at Belle II are based on standalone reconstruction in the SVD and CDC as well as on a combination of the two approaches; they employ a number of...
We present recent results of the R&D for a novel 4D fast tracking system based on rad-hard pixel sensors and front-end electronics capable of reconstructing four dimensional particle trajectories in real time. Particularly relevant results are: i) timing resolution of 30 ps for 55 micron pitch 3D silicon pixel sensors measured in a recent beam test, ii) design and production of front-end...
During the High-Luminosity Phase 2 of LHC, up to 200 simultaneous inelastic proton-proton collisions per bunch crossing are expected. This poses a significant challenge for track reconstruction and its associated computing requirements due to the unprecedented number of particle hits in the tracker system. To tackle this issue, dedicated algorithms have been developed in order to...
The expected increase in simultaneous collisions creates a challenge for accurate particle track reconstruction in High Luminosity LHC experiments. Similar challenges arise in non-HEP trajectory reconstruction use-cases, where tracking and track evaluation algorithms are used. High occupancy, track density, complexity, and fast growth therefore exponentially increase the demand for...
Flavour Tagging is a major client for tracking in particle physics experiments at high energy colliders, where it is used to identify the experimental signatures of heavy flavour production. Among other features, charm and beauty hadron decays produce jets containing several tracks with large impact parameter. This work introduces a new architecture for Flavour Tagging, based on Deep Sets,...
To address the unprecedented scale of HL-LHC data, the Exa.TrkX (previously HEP.TrkX) project has been investigating a variety of machine learning approaches to particle track reconstruction. The most promising of these solutions, graph neural networks (GNN), process the event as a graph that connects track measurements (detector hits corresponding to nodes) with candidate line segments...
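A minimal sketch of the graph-construction step this implies, with illustrative geometric cuts (the real selections are detector- and region-specific):

    import numpy as np

    def build_hit_graph(layer, phi, z, dphi_max=0.05, dz_max=20.0):
        # Nodes are hits; candidate edges connect hits on adjacent layers
        # whose azimuthal and longitudinal separations pass loose cuts.
        # A GNN edge classifier then scores each candidate segment.
        edges = []
        for a in range(len(layer)):
            for b in range(len(layer)):
                if layer[b] != layer[a] + 1:
                    continue
                dphi = np.arctan2(np.sin(phi[b] - phi[a]),
                                  np.cos(phi[b] - phi[a]))  # wrapped to [-pi, pi]
                if abs(dphi) < dphi_max and abs(z[b] - z[a]) < dz_max:
                    edges.append((a, b))
        return np.asarray(edges)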
We will present the implementation of a kinematic Kalman-filter-based track fit. The kinematic fit uses time as the free parametric variable to describe the charged particle's path through space, and as an explicit fit parameter ($t_0$). The fit coherently integrates measurements from sensors where position is encoded as time (i.e. drift cells) with pure time sensors and geometric (solid-state)...
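For illustration (my notation, not necessarily the fit's convention), a charged particle in a solenoidal field $B\hat{z}$ traces a helix naturally parametrized by time, so position-type and time-type measurements constrain one common parameter set:

$$
\vec{x}(t) = \begin{pmatrix} c_x + R\cos\phi(t) \\ c_y + R\sin\phi(t) \\ z_0 + v_z\,(t - t_0) \end{pmatrix},
\qquad \phi(t) = \phi_0 + \Omega\,(t - t_0), \qquad \Omega = \frac{qB}{\gamma m},
$$

with fit parameters $(c_x, c_y, R, \phi_0, z_0, v_z, t_0)$.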
Efficient agglomerative clustering is reliant on the ability to exploit useful lower-order information contained within data, yet many real-world datasets do not consist of features which are naturally amenable to metric functions as required by these algorithms. In this work, we present a framework for learning representations which contain such metric structure, allowing for efficient...
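A sketch of the resulting pipeline, assuming a trained embedding network is available (replaced here by a fixed random linear map so the snippet runs):

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def cluster_in_learned_space(embed, X, n_clusters):
        # Map raw features into the learned space, where Euclidean
        # distance is meaningful by construction, then agglomerate.
        Z = linkage(embed(X), method="ward")  # hierarchical merge tree
        return fcluster(Z, t=n_clusters, criterion="maxclust")

    rng = np.random.default_rng(1)
    W = rng.normal(size=(8, 3))               # stand-in for the trained embedding
    labels = cluster_in_learned_space(lambda X: X @ W,
                                      rng.normal(size=(50, 8)), n_clusters=4)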
The data taking phase of the completed Belle II detector at SuperKEKB has been under way since April 2019. The high beam currents and the nanobeam scheme of SuperKEKB demand an efficient first-level trigger system to reject the dominant background from outside the interaction region before further filtering by accurate, but more time-consuming, software algorithms. The Neural z vertex...
sPHENIX is a new experiment being constructed at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory. The primary physics goals of sPHENIX are to measure jets, their substructure, and the upsilon resonances in both p+p and Au+Au collisions. To realize these goals, a tracking system composed of a time projection chamber and several silicon detectors will be used to...
ATLAS event reconstruction requires the identification and selection of a hard-scatter (HS) primary vertex among the multiple interaction vertices reconstructed in each event. In Run 3, the HS vertex candidate is selected based on the largest sum of squared transverse momenta over the associated tracks. While this method works very well in events containing hard physics objects within the...
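The Run 3 criterion quoted above fits in a few lines; a minimal sketch with hypothetical Track and Vertex containers:

    from dataclasses import dataclass

    @dataclass
    class Track:
        pt: float          # transverse momentum, GeV

    @dataclass
    class Vertex:
        tracks: list

    def select_hard_scatter(vertices):
        # hard-scatter candidate = vertex maximising sum(pT^2) of its tracks
        return max(vertices, key=lambda v: sum(t.pt ** 2 for t in v.tracks))

    pvs = [Vertex([Track(0.9), Track(1.1)]), Vertex([Track(45.0), Track(38.0)])]
    hs = select_hard_scatter(pvs)  # picks the second (hard) vertex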
In LHC Run 3, ALICE will increase the data taking rate significantly, to 50 kHz continuous readout of minimum bias Pb-Pb collisions. The reconstruction strategy of the online-offline computing upgrade foresees a first synchronous online reconstruction stage during data taking, enabling detector calibration, and a subsequent calibrated asynchronous reconstruction stage. The main challenges...
A 40 MHz scouting system at CMS would provide fast and virtually unlimited statistics for detector diagnostics, alternative luminosity measurements and, in some cases, calibrations, and it has the potential to enable the study of otherwise inaccessible signatures, either too common to fit in the L1 accept budget, or with requirements which are orthogonal to “mainstream” physics, such as...
To sustain the harsher conditions of high luminosity LHC in 2026, the CMS experiment has designed a novel endcap calorimeter that uses silicon sensors to achieve radiation tolerance, with the additional benefit of a very high readout granularity. In regions characterised by lower radiation levels, small scintillator tiles with individual SiPM readout are employed. A novel reconstruction...
The Kalman Filter approach to fitting charged particle trajectories is widespread in modern complex tracking systems. At the same time, the global fit of the detector geometry using Newton-Raphson fitted tracks remains the baseline method to achieve efficient and reliable track-based alignment which is free from weak-mode biases affecting physics measurements. A brief reminder of the global...
The Belle II experiment at the B-Factory SuperKEKB in Tsukuba, Japan performs precision tests of the standard model and searches for new physics. Due to the high luminosity and beam currents, Belle II faces severe rates of background tracks displaced from the collision region, which have to be rejected within the tight timing constraints of the first level trigger. To this end, a novel neural...
The “Artificial Retina” is a highly-parallelized tracking architecture that promises high throughput, low latency, and low power when implemented in state-of-the-art FPGA devices. Operating on events before event building, the “Artificial Retina” needs a dedicated distribution network of large bandwidth and low latency, delivering to every FPGA the hits required to perform track reconstruction. This is a...
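For intuition, a toy straight-line, 2D version of the retina response (geometry and parameters are illustrative): each cell of a grid in track-parameter space accumulates a Gaussian excitation from every hit, all cells are independent, and track candidates appear as local maxima of the response map:

    import numpy as np

    def retina_response(m_grid, q_grid, hits, sigma=0.5):
        # Each cell (m, q) is a candidate straight track u = m * x + q.
        # Every hit (x, u) excites nearby cells through a Gaussian of its
        # residual; the map parallelises trivially across cells (FPGA).
        M, Q = np.meshgrid(m_grid, q_grid, indexing="ij")
        R = np.zeros_like(M)
        for x, u in hits:
            resid = u - (M * x + Q)
            R += np.exp(-0.5 * (resid / sigma) ** 2)
        return R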
The reconstruction of particle trajectories and their associated vertices is an essential task in the event reconstruction of most high energy physics experiments. In order to maintain or even improve upon the current performance of tracking and vertexing algorithms under the upcoming challenges of increasing energies and ever increasing luminosities in the future, major software upgrades are...
A fast hardware-based track trigger is being developed in ATLAS for the High Luminosity upgrade of the Large Hadron Collider (HL-LHC). The goal is to provide the high-level trigger with full-scan tracking at 100 kHz and regional tracking at 1 MHz, in the high pile-up conditions of the HL-LHC. A method for fast pattern recognition using the Hough transform is investigated. In this method,...
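A minimal sketch of the accumulator filling, assuming a linearised track model $\phi_0 \approx \phi - A\,r\,(q/p_T)$ (the constant $A$ and the binning are illustrative):

    import numpy as np

    def hough_accumulator(r, phi, qpt_values, phi0_edges, A=3e-4):
        # Each hit (r, phi) traces a line in (q/pT, phi0) parameter space;
        # cells crossed by many hits indicate a track road.
        acc = np.zeros((len(qpt_values), len(phi0_edges) - 1), dtype=int)
        for ri, phii in zip(r, phi):
            phi0 = phii - A * ri * qpt_values          # one phi0 per q/pT bin
            idx = np.digitize(phi0, phi0_edges) - 1
            ok = (idx >= 0) & (idx < acc.shape[1])
            acc[np.arange(len(qpt_values))[ok], idx[ok]] += 1
        return acc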
This talk will discuss work carried out by the Exa.TrkX collaboration to explore the application of Graph Neural Network (GNN)-based techniques for reconstructing particle interactions in wire-based Liquid Argon Time Projection Chambers (LArTPCs). LArTPC detector technology is utilised by many neutrino experiments, including future flagship US neutrino experiment DUNE, and techniques for fully...
A Liquid Argon Time Projection Chamber (LArTPC) is a type of particle imaging detector that can record an image of charged particle trajectories with high (~mm/pixel) spatial resolution and calorimetric information. At the intensity frontier of high energy physics, the LArTPC is a detector technology of choice for a number of experiments including the Short Baseline Neutrino program and the Deep Underground...
Machine learning (ML) techniques, in particular deep neural networks (DNNs) developed in the field of Computer Vision, have shown promising results for addressing the challenge of analyzing data from large, high-resolution particle imaging detectors such as Liquid Argon Time Projection Chambers (LArTPCs), employed in accelerator-based neutrino experiments including the Short Baseline Neutrino (SBN)...
The MicroBooNE experiment employs a Liquid Argon Time Projection Chamber (LArTPC) detector to measure sub-GeV neutrino interactions from the muon neutrino beam produced by the Booster Neutrino Beamline at Fermilab. The detector consists of roughly 90 tonnes of liquid argon in which the 3D trajectories of charged particles are recorded by combining timing with information from 3 wire planes, each...
Charged particle tracking is the most computationally intensive step of event reconstruction at the LHC. Due to the computational cost, the current CMS online High Level Trigger only performs track reconstruction in detector regions of interest identified by the hardware trigger or other detector elements. We have made significant progress towards developing a parallelized and vectorized...
The reconstruction of charged particles’ trajectories is one of the most complex and CPU consuming parts of event processing in high energy experiments, in particular at future hadron colliders such as the High-Luminosity Large Hadron Collider (HL-LHC). Highly-performant tracking software exploiting both innovative tracking algorithms and modern computing architectures with many cores and...
We study a method to reconstruct a nonlinear manifold embedded in Euclidean space from point cloud data using only linear approximations. Such an approximation is possible by warping the submanifold via an embedding to a higher dimensional Euclidean space. The subsequent reduction in the curvature can be justified using techniques from geometry. The immediate use of this formalism is in...
The Large Hadron Collider has an enormous potential of discovering physics beyond the Standard Model, given the unprecedented collision energy and the large variety of production mechanisms that proton-proton collisions can probe. Unfortunately, only a small fraction of the produced events can be studied, while the majority of the events are rejected by the online filtering system. One is then...
Track finding is a critical and computationally expensive step of object reconstruction in LHC detectors. The current method of track reconstruction is a physics-inspired Kalman Filter guided combinatorial search. This procedure is highly accurate but is sequential and thus scales poorly with increased luminosity like that planned for the HL-LHC. It is therefore necessary to consider new...
The design of tracking detectors for high energy physics is an extremely complicated problem. A variety of orthogonal inputs have to be taken into account: the available detector volume, technology, environment, stability, budgetary constraints and - last but not least - the physics performance. Several of these components involve careful evaluation (partly using simulation studies, partly...
The Deep Underground Neutrino Experiment (DUNE) is an international collaboration focused on studying neutrino oscillation over a long baseline (1300 km). DUNE will make use of a near detector and neutrino beam originating at Fermilab in Batavia, IL, and a far detector operating 1.5 km underground at the Sanford Underground Research Facility in Lead, South Dakota. The near and far...
A major challenge for the high-luminosity upgrade of the CERN LHC is to determine the interaction vertex of the hard scattering process from the 200 simultaneous interactions (pileup) that are expected to occur in each bunch crossing. To meet this challenge, the upgrade of the CMS experiment comprises a complete replacement of the silicon tracker that will allow for the first time the...
The VXDTF2 (VerteX Detector Track Finder 2nd) is the first implementation of a sector-map-based track finder to be used on high energy physics data, namely on the data collected by the Belle II experiment, which is now recording the $e^+e^-$ collisions produced at the second generation B-factory SuperKEKB at KEK (Tsukuba). The main concepts of the algorithm and the design choices of the VXDTF2...
High energy hadronic interactions at the LHC provide ATLAS with a wide variety of physics signatures. Due to the high centre-of-mass energy, tracks in signatures such as hadronic jets and tau lepton decays can become highly collimated. The separation between these tracks can become smaller than the ATLAS Inner Detector's sensitive elements, resulting in a loss of reconstruction efficiency. In...