It is usually difficult to describe the non-uniformity of the liquid in a detector, because detector simulations such as Geant4 normally construct the geometry with a fixed, hand-written description. We propose a method based on the Geometry Description Markup Language (GDML) and a tessellated detector description to share detector geometry information between computational fluid dynamics simulation software and...
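A minimal sketch of the idea, assuming the open-source pyg4ometry package as the conversion tool (the abstract does not name a specific tool; file names, sizes and materials below are placeholders): a triangulated surface exported from the CFD side is registered as a GDML tessellated solid that Geant4 can then import.

```python
# Illustrative sketch only (not the authors' code): convert a triangulated
# surface exported from the CFD mesh (e.g. as STL) into a GDML tessellated
# solid, so Geant4 can pick up the same, possibly non-uniform, liquid region.
import pyg4ometry

reg = pyg4ometry.geant4.Registry()

# A simple box world to contain the imported liquid region (placeholder sizes, mm).
world_solid = pyg4ometry.geant4.solid.Box("world_solid", 2000, 2000, 2000, reg)
world_lv = pyg4ometry.geant4.LogicalVolume(world_solid, "G4_AIR", "world_lv", reg)

# Read the CFD surface mesh and register it as a tessellated solid.
stl = pyg4ometry.stl.Reader("liquid_region.stl", registry=reg)
liquid_solid = stl.getSolid()
liquid_lv = pyg4ometry.geant4.LogicalVolume(liquid_solid, "G4_WATER", "liquid_lv", reg)
pyg4ometry.geant4.PhysicalVolume([0, 0, 0], [0, 0, 0],
                                 liquid_lv, "liquid_pv", world_lv, reg)

# Write everything out as GDML; Geant4 can then load the same file with G4GDMLParser.
reg.setWorld(world_lv.name)
writer = pyg4ometry.gdml.Writer()
writer.addDetector(reg)
writer.write("liquid_region.gdml")
```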
The increased use of accelerators for scientific computing, together with the increased variety of hardware involved, creates a need for performance portability between, at a minimum, CPUs (which largely dominate the WLCG infrastructure) and GPUs (which are quickly emerging as an architecture of choice for online data processing and HPC centers). In the C/C++ community, OpenCL was a low-level first...
In the past decade, Data and Analysis Preservation (DAP) has gained increased prominence within the scope of effort of major High Energy and Nuclear Physics (HEP/NP) experiments, driven by the policies of the funding agencies as well as by the realization of the benefits DAP brings to the science output of many projects in the field. It is a complex domain which, in addition to archival of...
HENP experiments are preparing for the HL-LHC era, which will bring an unprecedented volume of scientific data. These data will need to be stored and processed by the collaborations, but the expected resource growth falls far short of the requirements extrapolated from existing models, both in storage volume and in compute power. In this report, we will focus on building a prototype of a distributed data processing and...
The installation and maintenance of scientific software for research in experimental, phenomenological, and theoretical High Energy Physics (HEP) requires a considerable amount of time and expertise. While many tools are available to make the task of installation and maintenance much easier, many of these tools require maintenance of their own, have little documentation, and very few...
The foreseen increase in demand for simulations of particle transport through detectors in High Energy Physics has motivated the search for faster alternatives to Monte Carlo based simulation. Deep learning approaches provide promising results in terms of speed-up and accuracy, among which generative adversarial networks (GANs) appear to be the most successful at reproducing realistic detector data....
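As a rough illustration of the GAN approach (not the architecture used in the study; the 24x24 "shower image" shape, layer sizes, and hyperparameters below are placeholder assumptions), a minimal Keras sketch of the alternating generator/discriminator training looks like this:

```python
# Minimal, illustrative GAN sketch (Keras), not the model from the abstract:
# a generator maps noise to a calorimeter-like image, a discriminator scores
# real vs. generated showers. Shapes and hyperparameters are placeholders.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

LATENT_DIM = 64
IMG_SHAPE = (24, 24, 1)  # assumed 24x24 grid of calorimeter cells

generator = keras.Sequential([
    layers.Input((LATENT_DIM,)),
    layers.Dense(6 * 6 * 64, activation="relu"),
    layers.Reshape((6, 6, 64)),
    layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(1, 4, strides=2, padding="same", activation="relu"),
])

discriminator = keras.Sequential([
    layers.Input(IMG_SHAPE),
    layers.Conv2D(32, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2D(64, 4, strides=2, padding="same", activation="relu"),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Combined model used to train the generator (discriminator frozen here).
discriminator.trainable = False
gan = keras.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

def train_step(real_showers, batch_size=128):
    """One alternating update: discriminator on real + fake, then generator."""
    noise = np.random.normal(size=(batch_size, LATENT_DIM))
    fake = generator.predict(noise, verbose=0)
    discriminator.train_on_batch(real_showers, np.ones((len(real_showers), 1)))
    discriminator.train_on_batch(fake, np.zeros((batch_size, 1)))
    gan.train_on_batch(noise, np.ones((batch_size, 1)))
```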
The main computing and storage facility of INFN (Italian Institute for Nuclear Physics) running at CNAF hosts and manages tens of Petabytes of data produced by the LHC (Large Hadron Collider) experiments at CERN and other scientific collaborations in which INFN is involved. The majority of these data are stored on tape resources of different technologies.
All the tape drives can be used for...
Particle accelerators are an important tool for studying the fundamental properties of elementary particles. Currently, the highest-energy accelerator is the LHC at CERN in Geneva, Switzerland. Each of its four major detectors, including the CMS detector, produces dozens of Petabytes of data per year to be analyzed by a large international collaboration. The processing is carried out on the...
Over the next decade, the ATLAS experiment will be required to operate in an increasingly harsh collision environment. To maintain physics performance, the ATLAS experiment will undergo a series of upgrades during major shutdowns. A key goal of these upgrades is to improve the capacity and flexibility of the detector readout system. To this end, the Front-End Link eXchange (FELIX) system was...
This talk introduces two FPGA-based techniques to improve fast track finding in the ATLAS trigger and shows their simulated performance. A fast hardware-based track trigger is being developed in ATLAS for the High Luminosity upgrade of the Large Hadron Collider (HL-LHC), the goal of which is to provide the high-level trigger with full-scan tracking at 100 kHz in the high pile-up conditions of...
We present a package for the simulation of Dark Matter (DM) particles in fixed-target experiments. The most convenient way to perform this simulation (and the only possible way in the case of a beam dump) is within a Monte Carlo program that traces the particles through the experimental setup. The Geant4 toolkit framework was chosen as the most popular and versatile...
We introduce a novel method for identifying the fractions of primary air-shower particles in an ensemble of events using deep learning. The suggested approach is developed on Monte Carlo simulated data for the Telescope Array experiment. For a given hadronic model, the error in identifying individual fractions of primary particles in an ensemble is less than 7%. We show that the developed...
We investigate the possibility of using Deep Learning algorithms for jet identification in the L1 trigger at HL-LHC. We perform a survey of architectures (MLP, CNN, Graph Networks) and benchmark their performance and resource consumption on FPGAs using a QKeras+hls4ml compression-aware training procedure. We use the HLS4ML jet dataset to compare the results obtained in this study to previous...
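A minimal sketch of such a compression-aware flow, assuming the publicly documented QKeras and hls4ml interfaces (layer widths, bit precisions, and the output directory are placeholders, not the configurations benchmarked in the study):

```python
# Illustrative sketch only: a small quantized MLP defined with QKeras and
# converted to an FPGA HLS project with hls4ml. Sizes and bit widths are
# placeholders; the FPGA part is left at the hls4ml default.
from tensorflow import keras
from tensorflow.keras import layers
from qkeras import QDense, QActivation, quantized_bits, quantized_relu
import hls4ml

# 16 input features, 5 jet classes (as in the public hls4ml jet dataset).
inputs = keras.Input(shape=(16,))
x = QDense(32, kernel_quantizer=quantized_bits(6, 0, alpha=1),
           bias_quantizer=quantized_bits(6, 0, alpha=1))(inputs)
x = QActivation(quantized_relu(6))(x)
x = QDense(16, kernel_quantizer=quantized_bits(6, 0, alpha=1),
           bias_quantizer=quantized_bits(6, 0, alpha=1))(x)
x = QActivation(quantized_relu(6))(x)
outputs = layers.Dense(5, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
# ... quantization-aware training on the jet dataset would happen here ...

# Convert the trained model to an HLS project to study resources and latency.
config = hls4ml.utils.config_from_keras_model(model, granularity="name")
hls_model = hls4ml.converters.convert_from_keras_model(
    model, hls_config=config, output_dir="hls_jet_tagger")
hls_model.compile()            # builds the C/C++ emulation for validation
# hls_model.build(csim=False)  # would run HLS synthesis for resource reports
```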
The baseline track finding algorithms adopted in the LHC experiments are based on combinatorial track-following techniques, in which the number of seeds scales non-linearly with the number of hits. The corresponding increase in CPU time, close to cubic, creates a huge and ever-increasing demand for computing power. This is particularly problematic for the silicon tracking detectors, where the hit...
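As an illustrative back-of-the-envelope estimate (the abstract does not spell out the seeding scheme, so the triplet assumption is ours): if seeds are built from unordered triplets of the $N$ hits in a detector region, the number of candidate seeds grows as $\binom{N}{3} = N(N-1)(N-2)/6 \approx N^3/6$, which is why the CPU time rises close to cubically with hit multiplicity.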
Modeling network data traffic is the most important task in the design and construction of new network centers and campus networks. The results of analyzing such models can be applied to the reorganization of existing centers and to the configuration of data routing protocols based on link usage. The paper shows how constant monitoring of the main directions of data transfer allows...
Tau leptons are used in a range of important ATLAS physics analyses, including the measurement of the SM Higgs boson coupling to fermions, searches for Higgs boson partners, and heavy resonances decaying into pairs of tau leptons. Events for these analyses are provided by a number of single and di-tau triggers including event topological requirements or the requirement of additional objects at...
The Mu2e experiment at Fermilab searches for the charged-lepton-flavor-violating, neutrinoless conversion of a negative muon into an electron in the field of an aluminum nucleus. If no events are observed, in three years of running Mu2e will improve on the previous upper limit by four orders of magnitude in search sensitivity.
Mu2e’s Trigger and Data Acquisition System (TDAQ) uses {\it otsdaq}...
We present a new version of the Monte Carlo event generator ReneSANCe. The generator takes into account complete one-loop electroweak (EW) corrections, QED corrections in the leading logarithmic approximation (LLA), and some higher-order QED and EW corrections to processes at e^+e^- colliders with finite particle masses and arbitrary polarizations of the initial particles. ReneSANCe effectively operates in...
The Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) is undertaking a Phase II upgrade program to face the harsh conditions imposed by the High Luminosity LHC (HL-LHC). This program comprises the installation of a new timing layer to measure the time of minimum ionizing particles (MIPs) with a time resolution of 30-40 ps. The time information of the tracks from this new...
The Jiangmen Underground Neutrino Observatory (JUNO), currently under construction in the south of China, is the largest Liquid Scintillator (LS) detector in the world. JUNO is a multipurpose neutrino experiment designed to determine neutrino mass ordering, precisely measure oscillation parameters, and study solar neutrinos, supernova neutrinos, geo-neutrinos and atmospheric neutrinos. The...