Conveners
Track 2 – Offline Computing: ML Reconstruction & PID
- Teng Jian Khoo (Université de Genève (CH))
Track 2 – Offline Computing: ML and generative simulation
- Chiara Ilaria Rovelli (Sapienza Università e INFN, Roma I (IT))
Track 2 – Offline Computing: G4 and simulation frameworks
- Chris Pinkenburg
Track 2 – Offline Computing: Lightweight simulation and optimisation
- Chris Pinkenburg
Track 2 – Offline Computing: Cross-experiment frameworks & foundations
- Chris Pinkenburg
Track 2 – Offline Computing: ML Tracking and parallelisation
- Paul James Laycock (Brookhaven National Laboratory (US))
Track 2 – Offline Computing: Reconstruction and Performance
- Chiara Ilaria Rovelli (Sapienza Università e INFN, Roma I (IT))
The Deep Underground Neutrino Experiment (DUNE) is an international effort to build the next-generation neutrino observatory to answer fundamental questions about the nature of elementary particles and their role in the universe. Integral to DUNE is the process of reconstruction, where the raw data from Liquid Argon Time Projection Chambers (LArTPC) are transformed into products that can be...
We use Graph Networks to learn representations of irregular detector geometries and to perform typical tasks on them, such as cluster segmentation or pattern recognition. Thanks to the flexibility and generality of the graph architecture, this kind of network can be applied to detectors of arbitrary geometry, representing the detector elements through a unique detector identification (e.g., physical...
We study the use of interaction networks to perform tasks related to jet reconstruction. In particular, we consider jet tagging for generic boosted-jet topologies, tagging of large-momentum H$\to$bb decays, and anomalous-jet detection. The achieved performance is compared to state-of-the-art deep learning approaches, based on Convolutional or Recurrent architectures. Unlike these approaches,...
For the High Luminosity LHC, the CMS collaboration made the ambitious choice of a high-granularity design to replace the existing endcap calorimeters. The thousands of particles coming from the multiple interactions create showers in the calorimeters, depositing energy simultaneously in adjacent cells. The data are analogous to a 3D gray-scale image that must be properly reconstructed.
In this...
Micro-Pattern Gas Detectors (MPGDs) are the new frontier in gas tracking systems. Among them, the triple Gas Electron Multiplier (triple-GEM) detectors are widely used. In particular, cylindrical triple-GEM (CGEM) detectors can be used as inner tracking devices in high energy physics experiments. In this contribution, a new offline software called GRAAL (Gem Reconstruction And...
The innovative Barrel DIRC (Detection of Internally Reflected Cherenkov light) counter will provide hadronic particle identification (PID) in the central region of the PANDA experiment at the new Facility for Antiproton and Ion Research (FAIR), Darmstadt, Germany. This detector is designed to separate charged pions and kaons with at least 3 standard deviations for momenta up to 3.5 GeV/c...
The LHCb detector at the LHC is a single-arm forward spectrometer dedicated to the study of $b$- and $c$-hadron states. During Run 1 and 2, the LHCb experiment collected a total of 9 fb$^{-1}$ of data, corresponding to the largest charmed-hadron dataset in the world and providing unparalleled datasets for studies of CP violation in the $B$ system, hadron spectroscopy and rare decays, not...
The ATLAS physics program relies on very large samples of GEANT4 simulated events, which provide a highly detailed and accurate simulation of the ATLAS detector. However, this accuracy comes with a high price in CPU, and the sensitivity of many physics analyses is already limited by the available Monte Carlo statistics and will be even more so in the future. Therefore, sophisticated fast...
In High Energy Physics, simulation is a key element for evaluating theoretical models and for detector design choices. The increase in the luminosity of particle accelerators leads to a higher computational cost in dealing with the orders-of-magnitude increase in collected data. Thus, novel methods for speeding up simulation procedures (FastSimulation tools) are being developed with the...
The future need for simulated events by the LHC experiments and their High Luminosity upgrades is expected to increase dramatically. As a consequence, research on new fast simulation solutions, based on Deep Generative Models, is very active and initial results look promising.
We have previously reported on a prototype that we have developed, based on 3-dimensional convolutional Generative...
LHCb is one of the major experiments operating at the Large Hadron Collider at CERN. The richness of the physics program and the increasing precision of the measurements in LHCb lead to the need for ever larger simulated samples. This need will increase further when the upgraded LHCb detector starts collecting data in LHC Run 3. Given the computing resources pledged for the production...
Belle II uses a Geant4-based simulation to determine the detector response to the generated decays of interest. A realistic detector simulation requires the inclusion of noise from beam-induced backgrounds. This is accomplished by overlaying random trigger data onto the simulated signal. To obtain statistically independent Monte Carlo events, a high number of random trigger events is desirable....
The Geant4 electromagnetic (EM) physics sub-packages are an important component of LHC experiment simulations. During Long Shutdown 2 of the LHC, these packages are under intensive development, and in this work we report progress towards the new Geant4 version 10.6. These developments include modifications that speed up EM physics computations, improve EM models, extend the set of models, and...
The STAR Heavy Flavor Tracker (HFT) has enabled a rich physics program, providing important insights into heavy quark behavior in heavy ion collisions. Acquiring data during the 2014 through 2016 runs at the Relativistic Heavy Ion Collider (RHIC), the HFT consisted of four layers of precision silicon sensors. Used in concert with the Time Projection Chamber (TPC), the HFT enables the...
VecGeom is a geometry modeller library with hit-detection features as needed by particle detector simulation at the LHC and beyond. It was incubated by a Geant R&D initiative, motivated by the goal of combining the code of Geant4 and ROOT/TGeo into a single, more maintainable piece of software within the EU-AIDA program.
So far, VecGeom is mainly used by LHC experiments as a geometry primitive...
The HL-LHC and the corresponding detector upgrades for the CMS experiment will present extreme challenges for the full simulation. In particular, increased precision in models of physics processes may be required for accurate reproduction of particle shower measurements from the upcoming High Granularity Calorimeter. The CPU performance impacts of several proposed physics models will be...
The JUNO (Jiangmen Underground Neutrino Observatory) experiment is a multi-purpose neutrino experiment designed to determine the neutrino mass hierarchy and precisely measure oscillation parameters. It is composed of a 20 kton liquid scintillator central detector equipped with 18000 20-inch PMTs and 25000 3-inch PMTs, a water pool with 2000 20-inch PMTs, and a top tracker. Monte Carlo simulation is a...
The ALICE experiment at the CERN LHC will feature several upgrades for Run 3, one of which is a new Inner Tracking System (ITS). The ITS upgrade is currently under development and commissioning. The new ITS will be installed during the ongoing Long Shutdown 2.
The specification for the ITS upgrade calls for event rates of up to 100 kHz for Pb-Pb and 400 kHz for pp, which is two orders of...
The increase in luminosity foreseen for the coming years of operation of the Large Hadron Collider (LHC) creates new challenges in computing efficiency for all participating experiments. To cope with these challenges, and in preparation for the third running period of the LHC, the LHCb collaboration is currently overhauling its software framework to better utilise modern computing architectures. This...
Software improvements in the ATLAS Geant4-based simulation are critical to keep up with the evolving hardware and increasing luminosity. Geant4 simulation currently accounts for about 50% of CPU consumption in ATLAS and it is expected to remain the leading CPU load during Run 4 (HL-LHC upgrade) with an approximately 25% share in the most optimistic computing model. The ATLAS experiment...
HEP experiments simulate the detector response by accessing all needed data and services within their own software frameworks. However, decoupling the simulation process from the experiment infrastructure can be useful for a number of tasks, among them the debugging of new features, the validation of multithreaded versus sequential simulation code, and the optimization of algorithms for HPCs....
The Heavy Photon Search (HPS) is an experiment at the Thomas Jefferson National Accelerator Facility designed to search for a hidden sector photon (A’) in fixed-target electro-production. It uses a silicon micro-strip tracking and vertexing detector inside a dipole magnet to measure charged particle trajectories and a fast lead-tungstate crystal calorimeter just downstream of the magnet to...
The large volume of data expected to be produced by the Belle II experiment presents the opportunity for studies of rare, previously inaccessible processes. Investigating such rare processes in a high data-volume environment necessitates a correspondingly high volume of Monte Carlo simulations to prepare analyses and gain a deep understanding of the physics processes contributing to each...
Estimations of the CPU resources that will be needed to produce simulated data for the future runs of the ATLAS experiment at the LHC indicate a compelling need to speed-up the process to reduce the computational time required. While different fast simulation projects are ongoing (FastCaloSim, FastChain, etc.), full Geant4 based simulation will still be heavily used and is expected to consume...
Detector description is an essential component in the simulation, reconstruction and analysis of data resulting from particle collisions in high energy physics experiments, and for detector development studies for future experiments. Current detector descriptions of running experiments are mostly experiment-specific implementations. DD4hep is an open source toolkit created in 2012 to serve...
DD4hep is an open-source software toolkit that provides comprehensive and complete generic detector descriptions for high energy physics (HEP) detectors. The Compact Muon Solenoid collaboration (CMS) has recently evaluated and adopted DD4hep to replace its custom detector description software. CMS has demanding software requirements as a very large, long-running experiment that must support...
The ROOT TTree data format encodes hundreds of petabytes of High Energy and Nuclear Physics events. Its columnar layout drives rapid analyses, as only those parts (branches) that are really used in a given analysis need to be read from storage. Its unique feature is the seamless C++ integration, which allows users to directly store their event classes without explicitly defining data schemas....
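As a sketch of the columnar layout and the seamless C++ integration described above: the hypothetical MyEvent class below is stored directly on a branch without an explicit schema definition (a user-defined class additionally needs a ROOT dictionary, e.g. generated with rootcling).

```cpp
// Minimal sketch: storing a user class in a TTree (MyEvent is hypothetical).
#include "TFile.h"
#include "TTree.h"
#include <vector>

struct MyEvent {
   int runNumber = 0;
   std::vector<float> trackPt;   // each member becomes a column ("branch")
};

void write_events() {
   TFile f("events.root", "RECREATE");
   TTree tree("Events", "example event tree");
   MyEvent evt;
   tree.Branch("evt", &evt);     // no explicit schema definition needed
   for (int i = 0; i < 1000; ++i) {
      evt.runNumber = 1;
      evt.trackPt = {1.0f * i, 2.0f * i};
      tree.Fill();               // data accumulate in per-branch buffers
   }
   tree.Write();                 // later reads touch only the branches used
}
```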
In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin,...
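As a minimal worked example of the forward-mode flavour of this idea (a generic sketch of the technique, not the tool presented in this contribution), dual numbers propagate a value together with its derivative through each elementary operation:

```cpp
// Forward-mode AD with dual numbers: each value carries its derivative,
// and every elementary operation applies the chain rule.
#include <cmath>
#include <cstdio>

struct Dual { double v, d; };   // value and derivative w.r.t. the input

Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }
Dual sin(Dual a) { return {std::sin(a.v), std::cos(a.v) * a.d}; }

int main() {
   Dual x{2.0, 1.0};            // seed: dx/dx = 1
   Dual y = sin(x * x) + x;     // f(x) = sin(x^2) + x
   // Exact derivative f'(x) = 2x cos(x^2) + 1, no finite-difference error.
   std::printf("f(2) = %g, f'(2) = %g\n", y.v, y.d);
   return 0;
}
```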
Pseudo-random number generation (PRNG) plays an important role in many areas of computational science. The highest randomness quality, exact reproducibility and CPU efficiency are important requirements for their use in the most demanding Monte Carlo calculations.
We review here the highest-quality PRNGs available, such as those based on the Kolmogorov-Anosov theory of mixing in classical...
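For illustration only: the C++ standard library already ships a RANLUX-family generator whose design traces back to these mixing ideas, and the sketch below shows the exact-reproducibility requirement in practice (the per-job stream offset is an assumed usage pattern):

```cpp
// Reproducible parallel streams from a fixed seed plus a per-job skip.
#include <random>
#include <cstdio>

int main() {
   std::ranlux48 gen0(12345);             // fixed seed -> bitwise-reproducible
   std::ranlux48 gen1(12345);
   gen1.discard(1000000);                 // hypothetical offset for job 1
   std::uniform_real_distribution<double> u(0.0, 1.0);
   std::printf("job0: %f  job1: %f\n", u(gen0), u(gen1));
   return 0;
}
```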
The Virtual Monte Carlo (VMC) package, together with its concrete implementations, provides a unified interface to different detector simulation transport engines such as GEANT3 or GEANT4. However, so far the simulation of one event has been restricted to the use of a single chosen engine.
We introduce here the possibility to mix multiple engines within the simulation of one event. Depending on user...
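A minimal sketch of that idea, with a hypothetical interface rather than the actual VMC API: a user-supplied criterion routes each track to the engine that should continue transporting it.

```cpp
// Hypothetical per-track engine selection (volume name "CALO" is made up).
#include <string>

enum class Engine { kGeant3, kGeant4 };

struct Track {
   int pdgCode;              // particle species
   std::string volumeName;   // detector volume the track is entering
};

// User criterion: simulate calorimeter showers with GEANT4, the rest with GEANT3.
Engine chooseEngine(const Track& t) {
   return t.volumeName == "CALO" ? Engine::kGeant4 : Engine::kGeant3;
}

int main() {
   Track muon{13, "CALO"};
   return chooseEngine(muon) == Engine::kGeant4 ? 0 : 1;
}
```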
The Jiangmen Underground Neutrino Observatory (JUNO) in China is a 20 kton liquid scintillator detector, designed primarily to determine the neutrino mass hierarchy, as well as to study various neutrino physics topics. Its core part consists of O(10^4) photomultiplier tubes (PMTs). Computations looping over this large number of PMTs on a CPU are very time consuming. GPU parallel computing...
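The per-PMT work is data parallel: each PMT's expected signal depends only on its own position relative to the event vertex. The sketch below (all names and constants are illustrative) shows the pattern with the C++17 parallel STL; a CUDA kernel would map one thread to one PMT in exactly the same way.

```cpp
// Per-PMT expected photon arrival times, computed in parallel.
#include <algorithm>
#include <execution>
#include <vector>
#include <cmath>

struct PMT { double x, y, z; };

std::vector<double> expectedArrivalTimes(const std::vector<PMT>& pmts,
                                         double vx, double vy, double vz) {
   std::vector<double> t(pmts.size());
   constexpr double v_light = 1.94e8;   // assumed effective light speed in LS, m/s
   std::transform(std::execution::par, pmts.begin(), pmts.end(), t.begin(),
                  [=](const PMT& p) {
                     const double dx = p.x - vx, dy = p.y - vy, dz = p.z - vz;
                     return std::sqrt(dx * dx + dy * dy + dz * dz) / v_light;
                  });
   return t;
}

int main() {
   std::vector<PMT> pmts(18000, PMT{10.0, 0.0, 0.0});   // one entry per 20-inch PMT
   return expectedArrivalTimes(pmts, 0, 0, 0).size() == pmts.size() ? 0 : 1;
}
```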
Neutrinos are particles that interact rarely, so identifying them requires large detectors, which produce large volumes of data. Processing this data with the available computing power is becoming more difficult as the detectors increase in size to reach their physics goals. In liquid argon time projection chambers (TPCs) the charged particles from neutrino interactions produce ionization electrons...
At the High Luminosity Large Hadron Collider (HL-LHC), many proton-proton collisions happen during a single bunch crossing. This leads on average to tens of thousands of particles emerging from the interaction region. Two major factors impede finding charged particle trajectories from measured hits in the tracking detectors. First, deciding whether a given set of hits was produced by a...
The Mikado approach is the winning algorithm of the final phase of the TrackML particle reconstruction challenge [1].
The algorithm is combinatorial. Its strategy is to reconstruct the data in small portions, each time trying not to damage the rest of the data. The idea is reminiscent of the Mikado game, where players must carefully remove wooden sticks one by one from a heap.
The algorithm does 60...
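A hedged sketch of that strategy (illustrative code, not the competition entry): each pass keeps only the highest-confidence tracks and removes their hits, so later passes search a cleaner event.

```cpp
// Mikado-style iterative reconstruction: take the "safe" tracks first.
#include <vector>

struct Hit   { double x, y, z; bool used = false; };
struct Track { std::vector<int> hitIds; double quality; };

// Stub for the per-pass combinatorial search among unused hits.
std::vector<Track> findCandidates(const std::vector<Hit>&, int /*pass*/) {
   return {};   // a real implementation would build track candidates here
}

std::vector<Track> mikadoReconstruct(std::vector<Hit>& hits, int nPasses) {
   std::vector<Track> accepted;
   for (int pass = 0; pass < nPasses; ++pass) {
      for (const Track& trk : findCandidates(hits, pass)) {
         if (trk.quality < 0.9) continue;                  // accept only confident tracks
         for (int id : trk.hitIds) hits[id].used = true;   // their hits leave the game
         accepted.push_back(trk);
      }
   }
   return accepted;
}

int main() {
   std::vector<Hit> hits(1000);
   auto tracks = mikadoReconstruct(hits, 60);   // pass count is illustrative
   return tracks.empty() ? 0 : 1;
}
```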
One of the most computationally challenging problems expected for the High-Luminosity Large Hadron Collider (HL-LHC) is finding and fitting particle tracks during event reconstruction. Algorithms used at the LHC today rely on Kalman filtering, which builds physical trajectories incrementally while incorporating material effects and error estimation. Recognizing the need for faster...
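The core building block of such fits is the Kalman measurement update; below is a scalar sketch (real track fits propagate a multi-component state vector and interleave material corrections between updates):

```cpp
// One-dimensional Kalman measurement update.
#include <cstdio>
#include <initializer_list>

struct State { double x; double P; };   // estimate and its variance

State kalmanUpdate(State s, double m, double R) {
   const double K = s.P / (s.P + R);    // gain: weighs prediction vs. measurement
   return { s.x + K * (m - s.x),        // estimate pulled toward the measurement
            (1.0 - K) * s.P };          // variance shrinks with each update
}

int main() {
   State s{0.0, 1.0};                        // vague prior
   for (double m : {0.9, 1.1, 1.0})          // three hits, each with R = 0.25
      s = kalmanUpdate(s, m, 0.25);
   std::printf("x = %g, P = %g\n", s.x, s.P);
   return 0;
}
```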
Large-scale neutrino detectors rely on accurate muon energy estimates to infer neutrino energy. Reconstruction methods that incorporate physics knowledge produce better results. The muon energy reconstruction algorithm Edepillim takes into account the entire pattern of energy loss along the muon track and uses probability distribution functions describing muon energy losses to...
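The likelihood idea behind such a method can be sketched as follows (illustrative code; the toy Gaussian stands in for PDFs that, in reality, are tabulated from detailed muon-propagation simulation):

```cpp
// Maximum-likelihood muon energy from the per-segment energy-loss pattern.
#include <vector>
#include <cmath>
#include <cstdio>

// Toy stand-in for p(dE | E): Gaussian around a mean loss proportional to E.
double lossPdf(double dE, double E) {
   const double kSqrt2Pi = 2.5066282746310002;
   const double mean = 2e-3 * E, sigma = 0.5 * mean + 1e-9;
   const double z = (dE - mean) / sigma;
   return std::exp(-0.5 * z * z) / (kSqrt2Pi * sigma);
}

double logLikelihood(const std::vector<double>& segmentLosses, double E) {
   double logL = 0.0;
   for (double dE : segmentLosses) logL += std::log(lossPdf(dE, E));
   return logL;
}

int main() {
   const std::vector<double> losses{2.1, 1.8, 2.4};   // toy losses per segment, GeV
   double bestE = 0.0, bestL = -1e300;
   for (double E = 100.0; E <= 2000.0; E += 10.0) {   // grid of trial energies
      const double L = logLikelihood(losses, E);
      if (L > bestL) { bestL = L; bestE = E; }
   }
   std::printf("E_reco ~ %g GeV\n", bestE);
   return 0;
}
```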
In March 2019 the Belle II detector began collecting data from $e^{+}e^{-}$ collisions at the SuperKEKB electron-positron collider. Belle II aims to collect a data sample 50 times larger than the previous generation of B-Factories. For Belle II analyses to be competitive it is crucial that calibration constants for this data are calculated promptly prior to the main data reconstruction.
To...
On March 25th 2019, the Belle II detector recorded the first collisions delivered by the SuperKEKB accelerator. This marked the beginning of the physics run with the vertex detector. The vertex detector was aligned initially with cosmic ray tracks without magnetic field, simultaneously with the drift chamber. The alignment method is based on Millepede II and the General Broken Lines track...
The tracking system of Belle II consists of a silicon vertex detector (VXD) and a cylindrical drift chamber (CDC), both operating in a magnetic field created by the main solenoid of 1.5 T and the final focusing magnets. The Belle II VXD is a combined tracking system composed of two layers of pixel detectors and four layers of double-sided silicon strip sensors (SVD). The drift chamber...
The increasing track multiplicity in ATLAS poses new challenges for the primary vertex reconstruction software, with more than 70 inelastic proton-proton collisions per beam crossing expected during Run 2 of the LHC and even more extreme vertex densities in the upcoming runs.
In order to address these challenges, two new tools were adapted.
The first is the Gaussian track density...
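The Gaussian track density seed can be sketched as follows (illustrative parameters, and a simple grid scan in place of the production tool's more efficient maximum search): each track contributes a Gaussian in z at its point of closest approach to the beamline, and the seed sits where the summed density peaks.

```cpp
// Gaussian track density along the beamline and its maximum as a vertex seed.
#include <vector>
#include <cmath>
#include <cstdio>

struct TrackZ { double z0, sigma; };   // beamline impact point and uncertainty

double density(const std::vector<TrackZ>& tracks, double z) {
   double sum = 0.0;
   for (const TrackZ& t : tracks) {
      const double u = (z - t.z0) / t.sigma;
      sum += std::exp(-0.5 * u * u) / t.sigma;   // unnormalized: fine for argmax
   }
   return sum;
}

double findSeed(const std::vector<TrackZ>& tracks) {
   double bestZ = 0.0, bestD = -1.0;
   for (double z = -150.0; z <= 150.0; z += 0.05) {   // mm, assumed scan range
      const double d = density(tracks, z);
      if (d > bestD) { bestD = d; bestZ = z; }
   }
   return bestZ;
}

int main() {
   const std::vector<TrackZ> tracks{{0.10, 0.05}, {0.12, 0.08}, {-5.0, 0.06}};
   std::printf("seed z = %g mm\n", findSeed(tracks));
   return 0;
}
```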
The Belle II experiment is a world-leading B-physics experiment. In 2017 BNL became a member of the Belle II collaboration, taking responsibility for maintaining and developing the Conditions Database (CDB), an archive of the detector’s conditions at the time of each recorded collision. This database tracks millions of variables, for example the detector’s level of electronic noise,...
The Run Registry of the Compact Muon Solenoid (CMS) experiment at the LHC is the central tool for keeping track of the results of the data quality monitoring scrutiny carried out by the collaboration. It has recently been upgraded for the upcoming Run 3 of the LHC to a new web application, which will replace the version successfully used during Run 1 and Run 2. It consists of a JavaScript web...