Because the cross section of dark matter is very small compared to that of the Standard Model (SM), a huge amount of simulation is required [1]. Hence, optimizing Central Processing Unit (CPU) time is crucial to increasing the efficiency of dark matter research in HEP. In this work, CPU time was studied using MadGraph5 as a simulation toolkit for dark matter studies at e+e- colliders. The...
Surrogate modeling and data-model convergence are important in any field utilizing probabilistic modeling, including High Energy Physics and Nuclear Physics. However, demonstrating that the model produces samples from the same underlying distribution as the true source can be problematic if the data are high-dimensional. The 1-D and multi-dimensional Kolmogorov-Smirnov test (ddKS) is a...
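As a point of reference for the multi-dimensional variant discussed above, the classic 1-D two-sample Kolmogorov-Smirnov statistic is simply the maximum distance between the two empirical CDFs. A minimal sketch (illustrative only; the function name and toy data are assumptions, not part of the ddKS implementation):

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample 1-D Kolmogorov-Smirnov statistic: the maximum
    absolute distance between the empirical CDFs of a and b."""
    a, b = np.sort(a), np.sort(b)
    all_vals = np.concatenate([a, b])
    # Empirical CDF of each sample evaluated at every observed value
    cdf_a = np.searchsorted(a, all_vals, side="right") / len(a)
    cdf_b = np.searchsorted(b, all_vals, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(0)
# Same-distribution samples give a small statistic; a shifted sample a large one
same = ks_statistic(rng.normal(size=5000), rng.normal(size=5000))
diff = ks_statistic(rng.normal(size=5000), rng.normal(1.0, 1.0, size=5000))
```

Extending this maximum-CDF-distance idea beyond one dimension is exactly where the naive approach breaks down, which motivates dedicated multi-dimensional constructions such as ddKS.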
Abstract. The Large Hadron Collider's third run poses new and interesting problems that all experiments have to tackle in order to fully exploit the benefits provided by the new architecture, such as the increase in the amount of data to be recorded. As part of the new developments that are taking place in the ALICE experiment, payloads that use more than a single processing...
The Solenoidal Tracker at RHIC (STAR) is a multipurpose experiment at the Relativistic Heavy Ion Collider (RHIC) with the primary goal of studying the formation and properties of the quark-gluon plasma. STAR is an international collaboration of member institutions and laboratories from around the world. Each yearly data-taking period produces petabytes of raw data collected by the experiment. STAR primarily uses...
Collecting, storing, and processing experimental data are an integral part of modern high-energy physics experiments. Various experiment databases and the corresponding information systems related to their use and support play an important role and, in many ways, combine online and offline data processing. One of them, the Configuration Database, is an essential part of a complex of information...
NICA (Nuclotron-based Ion Collider fAcility) is a new accelerator complex under construction at the Joint Institute for Nuclear Research in Dubna to study the properties of dense baryonic matter. The experiments of the NICA project have already generated and obtained substantial volumes of event data, and it is expected that the overall number of stored events will increase from the...
A difficult aspect of cyber security is the ability to achieve automated real-time intrusion prevention across various sets of systems. To this extent, several companies offer comprehensive solutions that leverage an "accuracy of scale" by moving much of the intelligence and detection to the Cloud, relying on an ever-growing set of data and analytics to increase decision accuracy....
Abstract. The ALICE Experiment at CERN’s Large Hadron Collider is undertaking a major upgrade during Long Shutdown 2 in 2019-2021, which includes a new Online-Offline computing system. To ensure the efficient operation of the upgraded experiment, and of its newly designed computing system, a new set of reliable and performant graphical interfaces is needed. These are to be used 24h/365d in...
Modern calorimeters for High Energy Physics (HEP) have very fine transverse and longitudinal segmentation to manage high incoming flux and improve particle identification capabilities. Compared to older calorimeter designs, this change alone alters the extraction of the number and energy of particles incident on the device from a simple Gaussian-template clustering problem to a highly...
Particle tracking is a challenging pattern recognition task in experimental particle physics. Traditional algorithms based on Kalman filters show desirable performance in finding tracks originating from collision points. However, for displaced tracks, dedicated tunings are often required to reach acceptable performance, as the quality of the seed for the Kalman filter has a direct impact...
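To make the Kalman-filter baseline concrete, the core of such a tracker is the familiar predict/update cycle over successive measurements. A toy 1-D constant-velocity sketch is given below; all matrices, noise levels, and names here are illustrative assumptions, not the tuning of any experiment's tracking code:

```python
import numpy as np

# Toy linear Kalman filter: state = [position, velocity], position-only hits.
F = np.array([[1.0, 1.0], [0.0, 1.0]])  # state transition (unit time step)
H = np.array([[1.0, 0.0]])              # measurement model: observe position
Q = 1e-4 * np.eye(2)                    # process noise (toy value)
R = np.array([[0.25]])                  # measurement noise variance (toy value)

def kalman_track(measurements, x0, P0):
    """Run predict/update over a sequence of hits; return state estimates."""
    x, P = x0.copy(), P0.copy()
    estimates = []
    for z in measurements:
        # Predict step: propagate state and covariance
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: fold in the measurement z
        y = np.array([z]) - H @ x            # innovation
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

rng = np.random.default_rng(1)
true_pos = np.arange(20) * 0.5               # straight "track", slope 0.5
hits = true_pos + rng.normal(0, 0.5, 20)     # noisy hits
est = kalman_track(hits, x0=np.zeros(2), P0=np.eye(2))
```

The seed dependence mentioned above enters through `x0` and `P0`: a poor initial state for a displaced track can bias the whole fit, which is why dedicated tunings are needed there.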
The ATLAS Technical Coordination Expert System is a knowledge-based application describing and simulating the ATLAS infrastructure, its components, and their relationships, in order to facilitate the sharing of knowledge, improve the communication among experts, and foresee potential consequences of interventions and failures. The developed software is key for planning ahead of the future...
The hls4ml project began as an effort to bring Neural Network inference to the L1 trigger systems of the LHC experiments. Since its initial proposal, the library has grown, integrating support for multiple backends, multiple network architectures (convolutional, recurrent, graph), extreme quantization (binary and ternary networks), and multiple applications (classification, regression, anomaly detection)....
There has been significant development recently in generative models for accelerating LHC simulations. Work on simulating jets has primarily used image-based representations, which tend to be sparse and of limited resolution. We advocate for the more natural 'particle cloud' representation of jets, i.e. as a set of particles in momentum space, and discuss four physics- and...
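In the particle-cloud representation advocated above, a jet is an unordered set of particles, each a small feature vector in momentum space, so any model or observable built on it should be permutation invariant. A minimal sketch (the feature choice and values are illustrative assumptions):

```python
import numpy as np

# A "particle cloud" jet: an unordered set of particles, each a feature
# vector (here pt, eta, phi in relative coordinates; values are made up).
jet = np.array([
    [0.40,  0.01, -0.02],
    [0.25, -0.03,  0.05],
    [0.20,  0.02,  0.01],
    [0.15, -0.01, -0.04],
])  # shape (n_particles, n_features)

def jet_pt_sum(cloud):
    """A set-level observable: scalar sum of particle pT.
    Because the cloud is a set, this must not depend on particle order."""
    return cloud[:, 0].sum()

# Shuffling the particles leaves any valid set-level quantity unchanged
shuffled = jet[np.random.default_rng(2).permutation(len(jet))]
```

Unlike a jet image, this representation is dense (no empty pixels) and has no intrinsic resolution limit, which is the motivation given in the abstract.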
HEP experiments heavily rely on the production and the storage of large datasets of simulated events. At the LHC, simulation workflows require about half of the available computing resources of a typical experiment. With the foreseen High Luminosity LHC upgrade, data volume and complexity are going to increase faster than the expected improvements in computing infrastructure. Speeding up the...
Recent developments in software to address challenges in the High-Luminosity LHC (HL-LHC) era allow novel approaches when interacting with the data and performing physics analysis. We employed software components primarily from IRIS-HEP to construct an analysis workflow of an ongoing ATLAS Run-2 physics analysis in the Python ecosystem. The software components in the analysis workflow include...
Novel cryogenic scintillating calorimeters, used in rare event search experiments, achieve sub-keV recoil energy thresholds. Such low thresholds require a careful raw-data analysis of triggered events. This includes the identification of particle recoils among artifacts, and the reconstruction of the corresponding recoil energies, despite a low signal-to-noise ratio. For this purpose we...