Conveners
Computing and Data Handling: Machine Learning 1
- Sang Un Ahn (Korea Institute of Science & Technology Information (KR))
Computing and Data Handling: Machine Learning 2 / Trigger and DAQ
- Sang Un Ahn (Korea Institute of Science & Technology Information (KR))
- Doris Yangsoo Kim (Soongsil University)
Computing and Data Handling: Detector Simulation Tools
- Elizabeth Sexton-Kennedy (Fermi National Accelerator Lab. (US))
Computing and Data Handling: Computing Model Evolutions
- Doris Yangsoo Kim (Soongsil University)
Computing and Data Handling: Reconstruction Performance for Jet/MET, Muon, and Particle ID
- Elizabeth Sexton-Kennedy (Fermi National Accelerator Lab. (US))
Machine learning is of increasing importance to high energy physics as dataset sizes and data rates grow, while sensitivity to standard model and new physics signals are continually pushed to new extremes. Machine learning has proven to be advantageous in many contexts, and applications now span areas as diverse as triggering, monitoring, reconstruction, simulation, and data analysis. This...
To attain its ultimate discovery goals, the luminosity of the Large Hadron Collider at CERN will be increased until the number of additional collisions reaches a level of 200 interactions per bunch crossing, a factor of 7 with respect to the current (2017) luminosity. This will be a challenge for the ATLAS and CMS experiments, in particular for the track reconstruction algorithms. In terms of software, the...
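As a rough consistency check (a back-of-the-envelope estimate, not part of the abstract), the quoted factor of 7 implies a 2017 pile-up of roughly $200/7 \approx 30$ simultaneous interactions per bunch crossing.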
The BESIII detector is a general-purpose spectrometer located at BEPCII. BEPCII is a double-ring e+e− collider running at center-of-mass energies between 2.0 and 4.6 GeV and has reached a peak luminosity of $1\times 10^{33}$ cm$^{-2}$s$^{-1}$ at $\sqrt{s} = 3770$ MeV. As an experiment at the high-precision frontier of hadron physics, since 2009 BESIII has collected the world's largest data...
Particle identification (PID) plays a crucial role in LHCb analyses. Combining information from LHCb subdetectors allows one to distinguish between various species of long-lived charged and neutral particles. PID performance directly affects the sensitivity of most LHCb measurements. Advanced multivariate approaches are used at LHCb to obtain the best PID performance and control systematic...
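As an illustration of the general idea (a toy sketch, not the LHCb algorithm), per-subdetector responses can be combined into a single multivariate PID discriminant; the input variables, their toy distributions, and the scikit-learn classifier below are illustrative assumptions.

```python
# Toy sketch: combine per-subdetector PID variables into one multivariate
# discriminant. Variable meanings and distributions are illustrative, not LHCb's.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10000
# Hypothetical inputs: Cherenkov delta-log-likelihood, calorimeter E/p, muon-station hits
signal = np.column_stack([rng.normal(3, 1, n), rng.normal(0.1, 0.05, n), rng.poisson(4, n)])
background = np.column_stack([rng.normal(0, 1, n), rng.normal(0.3, 0.1, n), rng.poisson(1, n)])
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=300, random_state=0)
clf.fit(X_train, y_train)
print("toy PID classification accuracy:", clf.score(X_test, y_test))
```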
In HEP experiments, the CPU resources required by MC simulations are constantly growing and now represent a very large fraction of the total computing power (greater than 75%). At the same time, the pace of performance improvements from technology is slowing down, so the only solution is a more efficient use of resources. Efforts are ongoing in the LHC experiments to provide multiple options for simulating...
Modeling the detector response to collisions is one of the most CPU-expensive and time-consuming tasks at the LHC experiments. The current ATLAS baseline, Geant4, is highly CPU intensive. With the large collision dataset expected in the future, CPU usage becomes critical. During LHC Run 1, a fast calorimeter simulation (FastCaloSim) was successfully used by ATLAS. FastCaloSim parametrizes the energy...
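As a schematic illustration of the idea behind a parametrized fast simulation (a toy sketch under simple assumptions, not the FastCaloSim parametrization), the reconstructed calorimeter energy can be drawn from a parametrized resolution model instead of simulating every shower in full detail; the resolution constants below are made up.

```python
# Toy sketch of a parametrized calorimeter response: instead of simulating a
# full shower, draw the reconstructed energy from a parametrized resolution
# sigma(E)/E = a/sqrt(E) (+) c. The constants a and c are illustrative only.
import numpy as np

rng = np.random.default_rng(42)

def fast_calo_response(e_true_gev, a=0.10, c=0.01):
    """Return a smeared energy for a particle of true energy e_true_gev [GeV]."""
    rel_sigma = np.hypot(a / np.sqrt(e_true_gev), c)   # stochastic (+) constant term
    return rng.normal(loc=e_true_gev, scale=rel_sigma * e_true_gev)

# Example: smear a 50 GeV particle many times and check the resolution
energies = np.array([fast_calo_response(50.0) for _ in range(10000)])
print("mean = %.2f GeV, rel. resolution = %.3f" % (energies.mean(), energies.std() / 50.0))
```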
We apply deep learning methods to various high energy physics problems, from jet reconstruction to top quark reconstruction at hadron colliders. Use cases and failure cases of various supervised and unsupervised learning methods are discussed. We describe our setup, which makes deep learning methods easier to adopt for users accustomed to analyzing data in ROOT formats.
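As a sketch of what such a bridge between ROOT data and a deep learning framework can look like (illustrative only; the file, tree, and branch names are assumptions, and the abstract does not specify which libraries the authors use), one common approach is to read branches into NumPy arrays with uproot and feed them to a small PyTorch network:

```python
# Illustrative sketch: read branches from a ROOT TTree into NumPy arrays with
# uproot and train a small PyTorch classifier on them. File, tree, and branch
# names are hypothetical.
import numpy as np
import uproot
import torch
import torch.nn as nn

tree = uproot.open("events.root")["Events"]                      # hypothetical file/tree
arrays = tree.arrays(["jet_pt", "jet_eta", "jet_mass", "label"], library="np")

x = torch.tensor(
    np.stack([arrays["jet_pt"], arrays["jet_eta"], arrays["jet_mass"]], axis=1),
    dtype=torch.float32,
)
y = torch.tensor(arrays["label"], dtype=torch.float32).unsqueeze(1)

model = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(10):                                          # tiny training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {loss.item():.4f}")
```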
Monitoring the quality of the data being collected by the CMS Muon system to ensure that it fulfills the requirements needed to be used for physics analyses is a time-consuming and labor-intensive task. The CMS Muon group is developing a reliable and robust tool that will make use of automated statistical tests and modern machine learning algorithms to reduce the resources needed to run and...
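As an example of the kind of automated statistical test such a tool can apply (a minimal sketch assuming a simple binned comparison; this is not the CMS Muon DQM implementation), a monitored histogram can be compared to a reference with a chi-square test and flagged when the p-value is low:

```python
# Minimal sketch: flag a monitored histogram that deviates from a reference
# using a chi-square test on the binned contents. Histograms and thresholds
# are illustrative only.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(1)
reference = rng.poisson(1000, size=50).astype(float)        # reference-run histogram
monitored = rng.poisson(1000, size=50).astype(float)        # current-run histogram
monitored[20:25] *= 0.5                                     # inject a fake dead region

# Normalise the reference to the monitored histogram's total before comparing
expected = reference * monitored.sum() / reference.sum()
stat, p_value = chisquare(monitored, f_exp=expected)

print(f"chi2 = {stat:.1f}, p = {p_value:.2e}")
if p_value < 1e-3:                                          # illustrative alarm threshold
    print("histogram flagged for shifter attention")
```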
Google said it has reduced electricity bills by introducing machine learning into its data center operations. And what else is there?
The LHCb upgrade trigger represents a new paradigm in high energy physics: a fully software trigger operating at the LHC bunch crossing frequency with a triggerless readout. The existing Level-0 hardware trigger in Run 2 has allowed us to test much of the upgrade strategy at 1 MHz. In this talk, we will describe the performance of the Run 2 trigger in pp and special data taking configurations,...
The CMS experiment selects events with a two-level trigger system: the Level-1 trigger (L1) and the High Level Trigger (HLT). The HLT is a farm made of approximately 30k CPU cores that reduces the rate from 100 kHz to about 1 kHz. The HLT has access to the full detector readout and runs a dedicated online event reconstruction to select events. In 2017, LHC instantaneous luminosity during...
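From the numbers quoted above one can estimate the constraints on the online reconstruction (a back-of-the-envelope estimate, not taken from the abstract): the HLT must reject events by a factor of $100\ \mathrm{kHz} / 1\ \mathrm{kHz} = 100$, and with roughly $3\times10^{4}$ cores processing a 100 kHz input stream the average budget is about $3\times10^{4}\ \mathrm{cores} / 10^{5}\ \mathrm{Hz} \approx 0.3$ s of single-core time per event.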
CUP, the Center for Underground Physics, is one of the research centers of the Institute for Basic Science (IBS) in Korea. CUP conducts several experiments in the fields of neutrinoless double beta decay, direct WIMP searches, and neutrino oscillations, such as the COSINE-100, AMoRE, and NEOS experiments. CUP has developed the DAQ systems for these experiments, including both hardware and software. In...
The experimental programs planned for the next decade are driving developments in the simulation domain; they include the High Luminosity LHC project (HL-LHC), neutrino experiments, and studies towards future facilities such as Linear Colliders (ILC/CLIC) and Future Circular Colliders (FCC). The next-generation detectors being planned for long-term future programs will have increased...
Evaluated data libraries are the foundation of physics modeling in Monte Carlo particle transport codes, such as Geant4, FLUKA and MCNP, which are used in high energy and nuclear physics experiments, accelerator studies and detector development. They encompass recommended cross sections, nuclear and atomic parameters, which may derive from theoretical calculations, evaluations of experimental...
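As a schematic illustration of how tabulated evaluated data enter a transport calculation (a minimal sketch with made-up numbers, not the lookup scheme of Geant4, FLUKA or MCNP), cross sections are typically stored on an energy grid and interpolated at the requested energy, here in log-log space:

```python
# Minimal sketch: interpolate a tabulated cross section at an arbitrary energy.
# The energy grid and cross-section values below are made up for illustration.
import numpy as np

energy_grid_mev = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
cross_section_barn = np.array([4.2, 3.1, 2.0, 1.4, 1.0, 0.7, 0.5])

def sigma(e_mev):
    """Log-log interpolation of the tabulated cross section."""
    return np.exp(np.interp(np.log(e_mev), np.log(energy_grid_mev),
                            np.log(cross_section_barn)))

print(f"sigma(7.5 MeV) ~ {sigma(7.5):.2f} b")
```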
The LHCb experiment is a fully instrumented forward spectrometer designed for precision studies in the flavour sector of the Standard Model with proton-proton collisions at the LHC. As part of its expanding physics programme, LHCb also collected data during LHC proton-nucleus collisions in 2013 and 2016 and during nucleus-nucleus collisions in 2015. These datasets provide access to unique...
LHCb is one of the major experiments operating at the Large Hadron Collider at CERN. The richness of the physics programme and the increasing precision of the measurements at LHCb lead to the need for ever larger simulated samples. This need will increase further when the upgraded LHCb detector starts collecting data in LHC Run 3. Given the computing resources pledged for the production...
In 2015, the LHCb experiment implemented a unique data processing model that allows for reconstructed objects created in the trigger to be persisted and analysed offline, without a loss in physics performance. This model has recently evolved such that arbitrary additional objects, in addition to those used in the trigger decision, can also be persisted. This allows for a more inclusive...
LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is no exception. In 2017, ATLAS steadily exploited almost 3M HS06 units, which corresponds to about 300,000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB.
Resources are provided mostly...
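The quoted figures imply (a simple division, not a statement from the abstract) an average computing power of roughly $3\times10^{6}\ \mathrm{HS06} / (3\times10^{5}\ \mathrm{cores}) \approx 10$ HS06 per core, the kind of conversion factor used when translating pledged HS06 into core counts.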
In LHC Run 3, the ALICE experiment will record 100 times more events than in previous runs. This is achieved with a continuous detector readout. To cope with such a huge amount of data, a new integrated Online-Offline (O2) computing infrastructure has been created. Part of this development is a new analysis framework.
In Run 1 and Run 2 a large fraction of the time to analyze a dataset has been...
HEPfit is a computational tool for the combination of indirect and direct constraints on High Energy Physics models. The code has a modular structure so that one can select the observables and models of interest, and it can be used to build customized models and observables. Its statistical framework is based on a Bayesian analysis driven by Markov Chain Monte Carlo (MCMC). However,...
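To illustrate the kind of sampling that underlies such a Bayesian fit (a minimal Metropolis-Hastings sketch with a toy one-parameter likelihood; HEPfit's actual engine and interfaces are not shown here), one draws parameter points whose density converges to the posterior:

```python
# Minimal Metropolis-Hastings sketch for a one-parameter Bayesian fit.
# The Gaussian "measurement" below is a toy stand-in for real observables.
import numpy as np

rng = np.random.default_rng(7)
measured, sigma = 1.2, 0.3                      # toy observable and uncertainty

def log_posterior(theta):
    # flat prior on [-5, 5], Gaussian likelihood around the toy measurement
    if abs(theta) > 5.0:
        return -np.inf
    return -0.5 * ((theta - measured) / sigma) ** 2

chain, theta = [], 0.0
for _ in range(20000):
    proposal = theta + rng.normal(0.0, 0.5)     # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    chain.append(theta)

chain = np.array(chain[5000:])                  # drop burn-in
print(f"posterior mean = {chain.mean():.3f}, std = {chain.std():.3f}")
```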
Prototype imaging electromagnetic and hadronic calorimeters developed and operated by the CALICE collaboration provide an unprecedented wealth of highly granular data of hadronic showers for a variety of active sensor elements and different absorber materials. In this presentation, we discuss the reconstruction and energy resolution of single hadrons in individual detectors and combined...
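Single-hadron energy resolutions of this kind are conventionally quoted with the parametrization $\sigma_E/E = a/\sqrt{E} \oplus b \oplus c/E$, where $a$ is the stochastic term, $b$ the constant term and $c$ the noise term (the standard form, given here for context; the abstract does not quote specific values).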
The LHCb particle identification (PID) system is composed of two ring-imaging Cherenkov detectors, a series of muon chambers and a calorimeter system. A novel strategy was introduced in Run 2, in which the selection of PID calibration samples for charged and neutral particles is implemented in the LHCb software trigger. Further processing of the data is required in order to provide...
Jets are the experimental signatures of energetic quarks and gluons produced in high energy processes, and they need to be calibrated in order to have the correct energy scale. A detailed understanding of both the energy scale and the transverse momentum resolution of jets at CMS is of crucial importance for many physics analyses. In this talk, we present the measurements of CMS jet energy...
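Schematically (the generic form of such a calibration, not the specific CMS procedure), the calibrated jet momentum is obtained from the raw reconstructed momentum by a multiplicative correction, $p_T^{\mathrm{corr}} = C(p_T^{\mathrm{raw}},\eta)\, p_T^{\mathrm{raw}}$, with the correction factor $C$ derived from simulation and in-situ measurements.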
Accurate measurement and identification of jets, missing energy and boosted hadronic resonances are crucial to most of the ATLAS physics programme, both in the domain of Standard Model precision measurements and in searches for physics beyond the SM. The ever-increasing LHC luminosity, while providing higher statistical sensitivity to rare processes, also leads to more challenging experimental...
The Compact Muon Solenoid (CMS) detector is one of the two multi-purpose experiments at the Large Hadron Collider (LHC) and has a broad physics program. Many aspects of this program depend on our ability to trigger on, reconstruct, and identify events with final-state muons in a wide range of momenta, from a few GeV to the TeV scale. Displaced muons can also be used as a benchmark for new...
The ATLAS experiment is a multi-purpose experiment installed at the Large Hadron Collider (LHC) at CERN, designed to study elementary particles and their interactions in high-energy collisions of proton and heavy ion beams.
Muon and tau leptons play an important role in many physics processes being investigated at the LHC. Hadronic decays of taus are reconstructed from the...
Since the beginning of LHC Run 2, many improvements have been made to the triggering, reconstruction, and identification of hadronic tau decays at CMS. The standard Hadron Plus Strips (HPS) tau reconstruction algorithm now benefits from a dynamic strip reconstruction. The HPS method has been extended to a version intended for highly Lorentz-boosted topologies and a version which is used in...