-
Kihong Park (Korea Institute of Science and Technology Information (KISTI)), Kihyeon Cho | Track 1: Computing Technology for Physics Research | Poster
Because the cross section of dark matter is very small compared to that of the Standard Model (SM), a huge amount of simulation is required [1]. Hence, optimizing Central Processing Unit (CPU) time is crucial to increasing the efficiency of dark matter research in HEP. In this work, CPU time was studied using MadGraph5 as a simulation toolkit for dark matter studies at e+e- colliders. The...
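A minimal sketch of the kind of CPU-time measurement such a study relies on, using Python's process CPU clock; `simulate_events` is a hypothetical stand-in for an event-generation call, not the MadGraph5 API:

```python
import time

def simulate_events(n):
    # Hypothetical stand-in for a MadGraph5 event-generation step;
    # here it just performs some CPU-bound work.
    return sum(i * i for i in range(n))

def measure_cpu_time(fn, *args):
    """Return (result, CPU seconds) for a single call, using
    process CPU time rather than wall-clock time."""
    t0 = time.process_time()
    result = fn(*args)
    return result, time.process_time() - t0

result, seconds = measure_cpu_time(simulate_events, 100_000)
```

Using `time.process_time` rather than `time.time` excludes sleep and I/O wait, which is the quantity an optimization study of this kind typically targets.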
-
Accelerated Computation of a High-Dimensional Kolmogorov-Smirnov Distance (contribution ID 619) | Dr Shane Jackson (PNNL) | Track 2: Data Analysis - Algorithms and Tools | Poster
Surrogate modeling and data-model convergence are important in any field utilizing probabilistic modeling, including High Energy Physics and Nuclear Physics. However, demonstrating that the model produces samples from the same underlying distribution as the true source can be problematic if the data are high-dimensional. The 1-D and multi-dimensional Kolmogorov-Smirnov test (ddKS) is a...
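For reference, the classical two-sample 1-D Kolmogorov-Smirnov distance that ddKS generalizes can be sketched in a few lines (a plain-Python illustration, not the authors' accelerated implementation):

```python
import bisect

def ks_distance(sample_a, sample_b):
    """Two-sample 1-D Kolmogorov-Smirnov distance: the maximum
    absolute difference between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(s, x):
        # Fraction of sample s with value <= x.
        return bisect.bisect_right(s, x) / len(s)

    # The maximum CDF gap is attained at one of the sample points.
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)
```

Identical samples give a distance of 0, fully separated samples give 1; the d-dimensional case is harder precisely because there is no single ordering of the sample points to exploit.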
-
Mr Sergiu Weisz (University Politehnica of Bucharest (RO)) | Track 1: Computing Technology for Physics Research | Poster
The Large Hadron Collider’s third run poses new and interesting problems that all experiments have to tackle in order to fully exploit the benefits provided by the new architecture, such as the increase in the amount of data to be recorded. As part of the new developments that are taking place in the ALICE experiment, payloads that use more than a single processing...
-
Irakli Chakaberia (Lawrence Berkeley National Lab. (US)) | Track 1: Computing Technology for Physics Research | Poster
The Solenoidal Tracker at RHIC (STAR) is a multipurpose experiment at the Relativistic Heavy Ion Collider (RHIC) whose primary goal is to study the formation and properties of the quark-gluon plasma. STAR is an international collaboration of member institutions and laboratories from around the world. Each yearly data-taking period produces petabytes of raw data. STAR primarily uses...
-
Dr Igor Alexandrov (Joint Institute for Nuclear Research (RU)) | Track 1: Computing Technology for Physics Research | Poster
Collecting, storing, and processing experimental data are integral parts of modern high-energy physics experiments. The various experiment databases and the corresponding information systems related to their use and support play an important role and, in many ways, combine online and offline data processing. One of them, the Configuration Database, is an essential part of a complex of information...
-
Peter Klimai (Moscow Institute of Physics and Technology (MIPT)) | Track 1: Computing Technology for Physics Research | Poster
NICA (Nuclotron-based Ion Collider fAcility) is a new accelerator complex under construction at the Joint Institute for Nuclear Research in Dubna to study the properties of dense baryonic matter. The experiments of the NICA project have already generated and obtained substantial volumes of event data, and it is expected that the overall number of stored events will increase from the...
-
Michael Poat (Brookhaven National Laboratory) | Track 1: Computing Technology for Physics Research | Poster
A difficult aspect of cyber security is achieving automated, real-time intrusion prevention across diverse sets of systems. To this end, several companies offer comprehensive solutions that leverage an “accuracy of scale”, moving much of the intelligence and detection to the Cloud and relying on an ever-growing set of data and analytics to increase decision accuracy....
-
George Raduta (CERN) | Track 1: Computing Technology for Physics Research | Poster
The ALICE Experiment at CERN’s Large Hadron Collider is undertaking a major upgrade during Long Shutdown 2 in 2019-2021, which includes a new Online-Offline computing system. To ensure the efficient operation of the upgraded experiment and of its newly designed computing system, a new set of reliable and performant graphical interfaces is needed. These are to be used 24h/365d in...
-
CMS Collaboration, Thomas Klijnsma (Fermi National Accelerator Lab. (US)) | Track 2: Data Analysis - Algorithms and Tools | Poster
Modern calorimeters for High Energy Physics (HEP) have very fine transverse and longitudinal segmentation to manage high incoming flux and improve particle identification capabilities. Compared to older calorimeter designs, this change alone alters the extraction of the number and energy of incident particles on the device from a simple Gaussian-template clustering problem to a highly...
-
Xiangyang Ju (Lawrence Berkeley National Lab. (US)), Daniel Thomas Murnane (Lawrence Berkeley National Lab. (US)), Chun-Yi Wang (National Tsing Hua University (TW)) | Track 2: Data Analysis - Algorithms and Tools | Poster
Particle tracking is a challenging pattern recognition task in experimental particle physics. Traditional algorithms based on Kalman filters perform well in finding tracks originating from collision points. However, for displaced tracks, dedicated tunings are often required to reach sensible performance, as the quality of the seed for the Kalman filter has a direct impact...
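The seed sensitivity mentioned above stems from the Kalman filter's recursive structure: each measurement update blends the current estimate with a new hit, so a poor starting estimate propagates. A scalar version of one measurement update, as an illustration only (real track fits use multi-dimensional states and propagation through the detector):

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman filter measurement update.
    x: state estimate, P: its variance,
    z: measurement, R: measurement noise variance."""
    K = P / (P + R)            # Kalman gain
    x_new = x + K * (z - x)    # blend prediction and measurement
    P_new = (1.0 - K) * P      # uncertainty shrinks after the update
    return x_new, P_new
```

With equal prior and measurement variances the gain is 0.5, so the update lands halfway between prediction and measurement and halves the variance.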
-
Gustavo Uribe (Universidad Antonio Narino (CO)) | Track 1: Computing Technology for Physics Research | Poster
The ATLAS Technical Coordination Expert System is a knowledge-based application describing and simulating the ATLAS infrastructure, its components, and their relationships, in order to facilitate the sharing of knowledge, improve the communication among experts, and foresee potential consequences of interventions and failures. The developed software is key for planning ahead of the future...
-
Vladimir Loncar (CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
The hls4ml project was started to bring Neural Network inference to the L1 trigger systems of the LHC experiments. Since its initial proposal, the library has grown, integrating support for multiple backends, multiple network architectures (convolutional, recurrent, graph), extreme quantization (binary and ternary networks), and multiple applications (classification, regression, anomaly detection)....
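As an illustration of the "extreme quantization" mentioned above, ternary quantization maps each weight to one of {-1, 0, +1}; a schematic version with a hypothetical threshold parameter, not the hls4ml API:

```python
def ternary_quantize(weights, threshold=0.5):
    """Map each weight to -1, 0, or +1: weights with magnitude below
    the threshold are zeroed, the rest keep only their sign."""
    return [0 if abs(w) < threshold else (1 if w > 0 else -1)
            for w in weights]
```

Replacing multiplications by sign flips and skips is what makes such networks attractive for FPGA-based trigger hardware.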
-
Raghav Kansal (Univ. of California San Diego (US)) | Track 2: Data Analysis - Algorithms and Tools | Poster
There has been significant development recently in generative models for accelerating LHC simulations. Work on simulating jets has primarily used image-based representations, which tend to be sparse and of limited resolution. We advocate for the more natural 'particle cloud' representation of jets, i.e. as a set of particles in momentum space, and discuss four physics- and...
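A jet in the particle cloud representation is just an unordered set of particles in momentum space, e.g. as (pT, eta, phi) tuples; jet-level observables then follow by permutation-invariant operations such as vector summation. An illustrative sketch, not the contribution's code:

```python
import math

def jet_pt(particles):
    """Transverse momentum of a jet given as a particle cloud:
    an unordered list of (pt, eta, phi) tuples."""
    px = sum(pt * math.cos(phi) for pt, _, phi in particles)
    py = sum(pt * math.sin(phi) for pt, _, phi in particles)
    return math.hypot(px, py)

# The set has no intrinsic ordering, so any permutation of the
# particles must give the same observable.
jet = [(30.0, 0.1, 0.0), (20.0, -0.2, 0.5)]
```

This permutation invariance is exactly what image-based representations lack and what set- and graph-based generative models can exploit.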
-
Mary Touranakou (National and Kapodistrian University of Athens (GR)), Breno Orzari (UNESP - Universidade Estadual Paulista (BR)) | Track 2: Data Analysis - Algorithms and Tools | Poster
HEP experiments rely heavily on the production and storage of large datasets of simulated events. At the LHC, simulation workflows require about half of the available computing resources of a typical experiment. With the foreseen High-Luminosity LHC upgrade, data volume and complexity will increase faster than the expected improvements in computing infrastructure. Speeding up the...
-
Kyungeon Choi (University of Texas at Austin (US)) | Track 1: Computing Technology for Physics Research | Poster
Recent developments in software to address challenges of the High-Luminosity LHC (HL-LHC) era allow novel approaches when interacting with the data and performing physics analysis. We employed software components primarily from IRIS-HEP to construct an analysis workflow of an ongoing ATLAS Run-2 physics analysis in the Python ecosystem. The software components in the analysis workflow include...
-
Felix Wagner (HEPHY Vienna) | Track 2: Data Analysis - Algorithms and Tools | Poster
Novel cryogenic scintillating calorimeters, used in rare-event search experiments, achieve sub-keV recoil-energy thresholds. Such low thresholds require a careful raw-data analysis of triggered events. This includes the identification of particle recoils among artifacts, and the reconstruction of the corresponding recoil energies, despite a low signal-to-noise ratio. For this purpose we...
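The triggering step described above amounts, in its simplest form, to detecting rising-edge threshold crossings in a sampled detector trace; a toy sketch under that assumption, not the experiment's analysis pipeline:

```python
def rising_edge_triggers(trace, threshold):
    """Indices where a sampled trace first crosses above the
    trigger threshold (rising edges only), so that a sustained
    excursion above threshold fires the trigger once."""
    return [i for i in range(1, len(trace))
            if trace[i] >= threshold and trace[i - 1] < threshold]
```

At sub-keV thresholds the challenge is that noise fluctuations also cross the threshold, which is why the identification of genuine particle recoils among artifacts becomes the central analysis task.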