The LHC accelerator is running at unprecedentedly high instantaneous
luminosities, allowing the experiments to collect a vast amount of
data. However, this astonishing performance comes with a
larger-than-designed number of interactions per crossing of proton
bunches (pile-up). During 2017, values of up to 60 interactions per bunch
crossing were routinely achieved, capped by the ability...
The projected proton beam intensity of the High Luminosity Large Hadron Collider (HL-LHC), slated to begin operation in 2026, will result in between 140 and 200 concurrent proton-proton interactions per 25 ns bunch crossing. The scientific program of the HL-LHC, which includes precision Higgs coupling measurements, measurements of vector boson scattering, and searches for new heavy or exotic...
Noise of non-astrophysical origin contaminates science data taken by the Advanced Laser Interferometer Gravitational-wave Observatory and Advanced Virgo gravitational-wave detectors. Characterization of instrumental and environmental noise transients has proven critical in identifying false positives in the first observing runs. Machine-Learning techniques have, in recent years, become more...
The Fast Tracker (FTK) is a hardware upgrade to the ATLAS trigger and data acquisition system providing global track reconstruction to the High-Level Trigger (HLT), with the goal of improving pile-up rejection. The FTK processes incoming data from the Pixel and SCT detectors (part of the Inner Detector, ID) at up to 100 kHz using custom electronic boards. ID hits are matched to pre-defined track...
Machine learning methods are becoming ubiquitous across the LHC and particle physics. However, the exploration of such techniques within the field in low-latency, low-power FPGA hardware has only just begun. There is great potential to improve trigger and data acquisition performance, and more generally to address pattern recognition problems, potentially even beyond these applications. We present a case study for using...
With the high luminosity upgrade of the LHC, incorporating tracking information into the CMS Level-1 trigger becomes necessary in order to maintain a manageable trigger rate. The main challenges Level-1 track finding faces are the large data throughput from the detector at the 40 MHz collision rate and the 4 μs time budget to reconstruct charged-particle tracks with sufficiently low transverse...
Many of the physics goals of ATLAS in the High Luminosity LHC era,
including precision studies of the Higgs boson, require an unprescaled
single muon trigger with a 20 GeV threshold. The selectivity of the
current ATLAS first-level muon trigger is limited by the moderate
spatial resolution of the muon trigger chambers. By incorporating the
precise tracking of the Monitored Drift Tube (MDT) chambers, the muon transverse...
Data Quality Assurance (QA) is an important aspect of every High-Energy Physics experiment, especially in the case of the ALICE experiment at the Large Hadron Collider (LHC), whose detectors are extremely sophisticated and complex devices. To avoid processing low-quality or redundant data, and to classify it for analysis, human experts are currently involved in an offline assessment of the...
In the CMS software, particle tracks are determined using a Combinatorial Track Finder algorithm. In order to optimize the speed and accuracy of the algorithm, the tracks are reconstructed using an iterative process: the easiest tracks are searched for first, and hits associated with good tracks found so far are excluded from consideration in the following iterations (masking)...
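A minimal sketch of this iterate-and-mask idea, in toy Python; the helper names and quality cuts are invented for illustration and are not the CMS Combinatorial Track Finder API:

```python
# Toy sketch of iterative track finding with hit masking.
import random

def find_tracks(hits, min_hits):
    """Stand-in finder: groups hits by the truth label the toy generator
    attached; a real finder would perform pattern recognition here."""
    groups = {}
    for hit in hits:
        groups.setdefault(hit[0], []).append(hit)
    return [g for g in groups.values() if len(g) >= min_hits]

def iterative_tracking(hits, iteration_cuts):
    tracks, remaining = [], set(hits)
    for min_hits in iteration_cuts:      # easiest (longest) tracks first
        found = find_tracks(remaining, min_hits)
        tracks.extend(found)
        for t in found:                  # mask hits of accepted tracks
            remaining -= set(t)
    return tracks

hits = [(tid, layer) for tid in range(5)
        for layer in range(random.randint(4, 10))]
print(len(iterative_tracking(hits, iteration_cuts=[8, 5, 3])), "tracks found")
```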
During LHC Run III, starting in 2020, the instantaneous luminosity at LHCb will be increased to up to $2\times10^{33}$ cm$^{-2}$ s$^{-1}$, five times larger than in Run II. The LHCb detector will therefore be upgraded in 2019: a full software event reconstruction will be performed by the trigger at the full bunch crossing rate, in order to profit from the higher instantaneous...
The alignment of the ATLAS Inner Detector is performed with a track-based alignment algorithm.
Its goal is to provide an accurate description of the detector geometry such
that track parameters are accurately determined and free from biases.
Its software implementation is modular and configurable,
with a clear separation of the alignment algorithm from the detector system specifics and the...
Machine learning in high energy physics relies heavily on simulation for fully supervised training. This often results in sub-optimal classification when ultimately applied to (unlabeled) data. At CTD2017, we showed how to avoid this problem by training directly on data using as input the fraction of signal and background in each training sample. We now have a new method that does not even...
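The earlier fraction-based training idea mentioned above can be sketched as follows (toy PyTorch code, not the authors' implementation): the classifier is trained so that its average output over each mixed sample matches that sample's known signal fraction.

```python
# Sketch of training on mixed samples with known signal fractions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def fraction_loss(batch, signal_fraction):
    # Compare the mean predicted signal probability to the known fraction.
    return (model(batch).mean() - signal_fraction) ** 2

# Two mixed samples with different (known, toy) signal fractions:
sample_a, frac_a = torch.randn(256, 4) + 0.5, 0.7
sample_b, frac_b = torch.randn(256, 4) - 0.5, 0.2

for _ in range(200):
    opt.zero_grad()
    loss = fraction_loss(sample_a, frac_a) + fraction_loss(sample_b, frac_b)
    loss.backward()
    opt.step()
```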
Background
Proton CT is a prototype imaging modality for the reconstruction of the Proton Stopping Power inside a patient for more accurate calculations of the dose distributions in proton therapy treatment planning systems. A prototype proton CT system, called the Digital Tracking Calorimeter (DTC), is currently under development, in which aluminum energy absorbers are sandwiched...
The High Luminosity LHC (HL-LHC) plans to increase the LHC dataset by an order of magnitude, increasing the potential for new physics discoveries. The HL-LHC upgrade, planned for 2025, will increase the peak luminosity to $7.5\times10^{34}$ cm$^{-2}$s$^{-1}$, corresponding to ~200 inelastic proton-proton collisions per beam crossing. To mitigate the increased radiation doses and pileup, the ATLAS Inner Detector...
With the planned addition of tracking information to the CMS Level-1 trigger for the HL-LHC, the Level-1 trigger algorithms can be completely reconceptualized. Following the example of offline reconstruction in CMS, which uses complementary subsystem information to mitigate pileup, we explore the feasibility of using Particle Flow-like and pileup-per-particle identification techniques...
Silicon tracking detectors can record the charge in each channel (analog or digitized) or have only binary readout (hit or no hit). While there is significant literature on the position resolution obtained from interpolation of charge measurements, a comprehensive study of the resolution obtainable with binary readout is lacking. It is commonly assumed that the binary resolution is...
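For reference, the commonly quoted binary resolution follows from treating the true hit position as uniformly distributed across one readout pitch $p$, with the channel center taken as the position estimate:

$$\sigma^2 = \frac{1}{p}\int_{-p/2}^{p/2} x^2\,dx = \frac{p^2}{12}, \qquad \sigma = \frac{p}{\sqrt{12}}.$$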
The pixel detectors for the High Luminosity upgrades of the ATLAS and CMS detectors will preserve digitized charge information in spite of extremely high hit rates. Both circuit physical size and output bandwidth will limit the number of bits to which charge can be digitized and stored. We therefore study the effect of the number of bits used for digitization and storage on single and...
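A toy numerical sketch of the effect being studied: digitize the charge with $n$ bits over a fixed dynamic range and observe the shift of a simple charge-weighted cluster position. The pitch, charge fractions, and range below are invented for illustration, not the ATLAS/CMS front-end specifications.

```python
# Effect of charge digitization bits on a center-of-gravity position.
import numpy as np

def digitize(charge, n_bits, full_scale=1.0):
    levels = 2 ** n_bits
    q = np.clip(charge, 0.0, full_scale)
    return np.floor(q / full_scale * levels).clip(max=levels - 1) / levels * full_scale

pitch = 50.0                        # um, hypothetical
positions = np.array([0.0, pitch])  # two-pixel cluster
charge = np.array([0.62, 0.38])     # toy charge-sharing fractions

for n in (8, 4, 2, 1):
    dq = digitize(charge, n)
    cog = (positions * dq).sum() / dq.sum() if dq.sum() > 0 else pitch / 2
    print(n, "bits -> cluster position", round(cog, 2), "um")
```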
Tracking in high density environments, particularly in high energy jets, plays an important role in many physics analyses at the LHC. In such environments, there is significant degradation of track reconstruction performance. Between runs 1 and 2, ATLAS implemented an algorithm that splits pixel clusters originating from multiple charged particles, using charge information, resulting in the...
Strong gravitational lensing is a phenomenon in which images of distant galaxies appear highly distorted due to the deflection of their light rays by the gravity of other intervening galaxies. We often see multiple distinct arc-shaped images of the background galaxy around the intervening (lens) galaxy, like images in a funhouse mirror. Strong lensing gives astrophysicists a unique opportunity...
The IceCube Neutrino Observatory is a Cherenkov detector deep in the Antarctic ice. Due to limited computational resources and the high data rate, only simplified reconstructions restricted to a small subset of data can be run on-site at the South Pole. However, in order to perform online analyses and to issue real-time alerts, fast and powerful reconstructions are desired.
Recent advances,...
Jet substructure techniques play a critical role in ATLAS in searches for new physics, and are being utilized in the trigger. They become increasingly important in detailed studies of the Standard Model, among them the inclusive search for the Higgs boson produced with high transverse momentum decaying to a bottom-antibottom quark pair. To date, ATLAS has mostly focused on the use of...
The High Luminosity LHC (HL-LHC) aims to increase the LHC dataset by an order of magnitude in order to increase its potential for discoveries. Starting from the middle of 2026, the HL-LHC is expected to reach the peak instantaneous luminosity of $7.5\cdot10^{34}$ cm$^{-2}$s$^{-1}$, which corresponds to about 200 inelastic proton-proton collisions per beam crossing. To cope with the large radiation...
The Fast TracKer (FTK) within the ATLAS trigger system provides global track reconstruction for all events passing the ATLAS Level 1 trigger by dividing the detector into parallel processing pipelines that implement pattern matching in custom integrated circuits and data routing, reduction, and parameter extraction in FPGAs. In this presentation we will describe the implementation of a...
In the era of the High-Luminosity Large Hadron Collider (HL-LHC), one of the most computationally challenging problems is expected to be finding and fitting particle tracks during event reconstruction. The algorithms currently in use at the LHC are based on Kalman filter techniques, which are known to be robust and provide good physics performance. Given the need for improved computational...
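For reference, a single predict-update step of the Kalman filter technique mentioned above can be written in standard notation, with $F$ the transport Jacobian between surfaces, $Q$ the process noise from material effects, $H$ the projection to measurement space, $V$ the measurement covariance, and $m$ the local measurement:

$$
\begin{aligned}
x_{k|k-1} &= F_k\,x_{k-1}, \qquad C_{k|k-1} = F_k C_{k-1} F_k^{T} + Q_k,\\
K_k &= C_{k|k-1} H_k^{T}\left(V_k + H_k C_{k|k-1} H_k^{T}\right)^{-1},\\
x_k &= x_{k|k-1} + K_k\left(m_k - H_k\,x_{k|k-1}\right), \qquad C_k = \left(I - K_k H_k\right) C_{k|k-1}.
\end{aligned}
$$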
In LHC Run 3, ALICE will significantly increase its data-taking rate to a continuous readout of minimum-bias Pb-Pb collisions at 50 kHz.
The reconstruction strategy of the online-offline computing upgrade foresees a first synchronous reconstruction stage during data taking, enabling detector calibration, followed by an asynchronous reconstruction stage using the full calibration.
We present a tracking...
Belle II - located at the $e^+e^-$ collider SuperKEKB operating at the $\Upsilon (4\mathrm S)$ energy - starts its first data taking run in February 2018.
Its ultimate goal is to measure with high precision a multitude of quantities in the flavor sector and to explore the many opportunities beyond, e.g. exotic hadronic states, afforded by its record-breaking instantaneous luminosity of $8\cdot...
The track reconstruction task of ATLAS and CMS will become computationally increasingly challenging with the LHC high luminosity upgrade. In the context of taking advantage of machine learning techniques, a clustering algorithm is proposed to group together hits that belong to the same particle. Clustering is referred to as unsupervised classification and is widely applied to big data. The...
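As a generic illustration of unsupervised hit clustering (not the specific algorithm proposed in this abstract), one can group nearby hits with a density-based method such as DBSCAN from scikit-learn; the toy features and eps value below are invented:

```python
# Group toy hits from two straight "tracks" without any labels.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
track1 = np.c_[x, 0.5 * x] + rng.normal(0, 0.01, (20, 2))         # toy track 1
track2 = np.c_[x, 0.5 * x - 0.5] + rng.normal(0, 0.01, (20, 2))   # toy track 2
hits = np.vstack([track1, track2])

labels = DBSCAN(eps=0.08, min_samples=3).fit_predict(hits)        # -1 = noise
print("clusters found:", sorted(set(labels) - {-1}))
```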
The high particle densities produced by the Large Hadron Collider (LHC) mean that in the ATLAS pixel detector the clusters of deposited charge start to merge. A neural network-based approach is used to estimate the number of particles contributing to each cluster, and to accurately estimate the hit positions even in the presence of multiple particles. This talk or poster will thoroughly...
Tracking in dense environments, such as in the cores of high-energy jets, will be key for new physics searches as well as measurements of the Standard Model at the High Luminosity LHC (HL-LHC). The HL-LHC will operate in challenging conditions with large radiation doses and high pile-up (up to $\mu$=200). The current tracking detector will be replaced with a new all-silicon Inner Tracker for...
The Coherent Muon to Electron Transition (COMET) experiment is designed to search for muon-to-electron conversion, a process with very good sensitivity to Beyond the Standard Model physics. The first phase of the experiment is currently under construction at J-PARC. This phase is designed to probe muon-to-electron conversion with a sensitivity 100 times better than the current limit. The experiment will...
The data input rates foreseen in High-Luminosity LHC (circa 2026) and High-Energy LHC (2030s) High Energy Physics (HEP) experiments impose challenging new requirements on data processing. The polynomial algorithmic complexity and other limitations of classical approaches to many central HEP problems motivate searches for alternative solutions featuring better scalability, higher performance and...
Conformal tracking is the novel and comprehensive tracking strategy adopted by the CLICdp Collaboration. It merges the concepts of conformal mapping and cellular automaton, providing efficient pattern recognition for prompt and displaced tracks, even in busy environments with 3 TeV CLIC beam-induced backgrounds. In this talk, the effectiveness of the algorithm will be shown by...
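The key property exploited by the conformal mapping: a circle through the origin, $x^2 + y^2 = 2ax + 2by$ (the trajectory of a prompt track in a solenoidal field), is mapped by

$$u = \frac{x}{x^2+y^2}, \qquad v = \frac{y}{x^2+y^2}$$

onto the straight line $2au + 2bv = 1$, so that pattern recognition reduces to straight-line finding, for which a cellular automaton is well suited.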
The design of next-generation particle accelerators evolves to higher and higher luminosities, as seen in the HL-LHC upgrade and the plans for the Future Circular Collider (FCC). Writing track reconstruction software that can cope in these high-pileup scenarios is a big challenge, due to the inherent complexity of current algorithmic approaches. In this contribution we present TrickTrack, a...
Reconstructing charged-particle trajectories is a central task in the event reconstruction of most particle physics experiments. With growing beam intensities and ever higher track densities, this combinatorial problem becomes increasingly challenging. Preserving physics performance in these difficult experimental conditions while at the same time keeping the computational cost at a reasonable level,...
For the past year, the HEP.TrkX project has been investigating machine learning solutions to LHC particle track reconstruction problems. A variety of models were studied that drew inspiration from computer vision applications and operated on an image-like representation of tracking detector data. While these approaches have shown some promise, image-based methods face challenges in scaling up...
In order to overcome the difficulty brought by curling charged tracks in the BESIII drift chamber, we introduce a Hough-transform-based tracking method, used as a supplement to find low-transverse-momentum tracks. This tracking algorithm is implemented in the BESIII offline software system, and its performance has been checked with both Monte Carlo simulation and data. The results show that...
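A generic Hough-transform sketch of the voting principle: every hit votes for all line parameters $(\theta, \rho)$ consistent with it, and peaks in the accumulator correspond to track candidates. The BESIII implementation works with circular (curling) tracks, but the idea is the same; the binning below is invented.

```python
# Straight-line Hough transform on toy hits.
import numpy as np

def hough_lines(hits, n_theta=180, n_rho=100, rho_max=2.0):
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in hits:
        rho = x * np.cos(thetas) + y * np.sin(thetas)     # one curve per hit
        idx = ((rho + rho_max) / (2 * rho_max) * n_rho).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.arange(n_theta)[ok], idx[ok]] += 1         # vote
    return acc, thetas

hits = [(t, 0.5 * t + 0.2) for t in np.linspace(0, 1, 15)]  # one toy track
acc, thetas = hough_lines(hits)
i, j = np.unravel_index(acc.argmax(), acc.shape)
print("peak votes:", acc[i, j], "at theta ~", round(float(thetas[i]), 2))
```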
The reconstruction of multi-turn curling tracks in the COMET Phase-I drift chamber is a challenge. A method based on the Deterministic Annealing Filter (DAF), which implements a global competition between hits from different turns, is introduced. This method assigns the detector measurements to the track hypothesis based on the weighted mean of the fit quality on different turns. Studies have been done on the...
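In the standard DAF formulation, the assignment weight of a competing measurement $i$ takes the form

$$p_i = \frac{e^{-\chi_i^2/2T}}{e^{-\chi_{\mathrm{cut}}^2/2T} + \sum_j e^{-\chi_j^2/2T}},$$

where $\chi_i^2$ quantifies the fit quality of hit $i$, $\chi_{\mathrm{cut}}^2$ is a cut-off, and the annealing temperature $T$ is lowered over iterations so that initially soft assignments gradually become hard ones.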
The pixel detector in the CMS experiment has been upgraded with an additional fourth barrel layer and a third forward disk, while maintaining the same pixel dimensions (150 × 100 μm²). Due to the large volume of data from the pixel detector, the processing power of the HLT CPUs is not sufficient to reconstruct tracks from all events. However, many trigger paths would benefit from pixel tracks to increase their...
In the CDR study of the CEPC project, tracking algorithms and their performance are an important task. To accommodate different designs of the tracker system, we are implementing corresponding tracking algorithms. Currently, we apply the existing Clupatra algorithm for TPC tracking, while also exploring ArborTracking. We attempt to use the existing ConformalTracking as the tracking for the design...
Recently we showed that deep learning can be used for model parameter estimation for strong gravitational lensing systems. Here we demonstrate a method for obtaining the uncertainties of these parameters. We use variational inference to obtain approximate posteriors of Bayesian neural networks and apply it to a network trained to estimate the parameters of the Singular Isothermal Ellipsoid...
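One common way to realize this kind of approximate variational inference is Monte-Carlo dropout: keep dropout active at test time and treat the spread of repeated forward passes as a parameter uncertainty. The toy network below is an illustration of that general technique, not the authors' model or architecture.

```python
# Monte-Carlo dropout: repeated stochastic forward passes give mean + spread.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Dropout(p=0.1),
                    nn.Linear(128, 5))   # 5 hypothetical lens parameters

def predict_with_uncertainty(x, n_samples=100):
    net.train()                          # keep dropout stochastic at test time
    with torch.no_grad():
        draws = torch.stack([net(x) for _ in range(n_samples)])
    return draws.mean(dim=0), draws.std(dim=0)

image_features = torch.randn(1, 64)      # stand-in for a lensed-image encoding
mean, sigma = predict_with_uncertainty(image_features)
print(mean.shape, sigma.shape)
```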