Xiaocong Ai (DESY) | Track 2: Data Analysis - Algorithms and Tools | Poster
Computing centres, including those used to process High-Energy Physics data and simulations, are increasingly providing significant fractions of their computing resources using hardware architectures other than x86 CPUs, with GPUs being a commonly available alternative. GPUs can provide excellent computational performance at a good price point for tasks that can be suitably parallelized....
Aryan Roy | Track 1: Computing Technology for Physics Research | Poster
Analysis of HEP data is an iterative process in which the results of one step often inform the next. In an exploratory analysis, it is common to perform one computation on a collection of events, then view the results (often with histograms) to decide what to try next. Awkward Array is a Scikit-HEP Python package that enables data analysis with array-at-a-time operations to implement cuts as...
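As a hedged illustration of the array-at-a-time style this abstract describes (a minimal sketch using plain NumPy on flat arrays with invented example data; Awkward Array extends the same idiom to jagged, record-structured event collections):

```python
import numpy as np

# Hypothetical flat arrays of per-event quantities (invented example data).
pt = np.array([12.0, 45.3, 8.1, 60.2, 25.7])
eta = np.array([0.5, -1.2, 2.3, 0.1, -0.8])

# One array-at-a-time cut instead of an explicit loop over events:
mask = (pt > 20.0) & (np.abs(eta) < 2.0)
selected_pt = pt[mask]  # only events passing both cuts survive
```

The same boolean-mask idiom, applied with Awkward Array, works on variable-length per-event collections (e.g. all jets in each event) without writing explicit loops.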
Xiaomei Zhang (Chinese Academy of Sciences (CN)), Dr Yang Yifan (Institute of High Energy Physics) | Track 1: Computing Technology for Physics Research | Poster
In the near future, many new high energy physics (HEP) experiments with challenging data volumes will come into operation or are planned at IHEP, China. A DIRAC-based distributed computing system has been set up to support these experiments. To make better use of the available distributed computing resources, it is important to provide experimental users with handy tools for the...
Michele Piero Blago (CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
Ring Imaging Cherenkov (RICH) detectors offer a powerful technique for identifying particle species in particle physics. These detectors produce 2D images formed by rings of individual photons superimposed on a background of photon rings from other particles. RICH particle identification (PID) is essential to the LHCb experiment at CERN. While the current PID algorithm...
Dr Tao Lin (Chinese Academy of Sciences (CN)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The Jiangmen Underground Neutrino Observatory (JUNO) is designed to determine the neutrino mass ordering and precisely measure oscillation parameters. It is under construction at a depth of 700 m underground and comprises a central detector, a water Cherenkov detector and a top tracker. The central detector is designed to detect anti-neutrinos with an energy resolution of 3% at 1 MeV, using a 20...
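As a rough numerical aside on the quoted resolution figure: assuming a purely stochastic scaling sigma(E)/E = 3% / sqrt(E in MeV) (an illustrative simplification; the real JUNO resolution model contains additional terms), the relative resolution improves with energy:

```python
import math

def relative_resolution(energy_mev, a=0.03):
    """Stochastic-term-only toy model: sigma/E = a / sqrt(E in MeV).

    The pure 1/sqrt(E) scaling is an assumption for illustration;
    a realistic resolution model would include noise and constant terms.
    """
    return a / math.sqrt(energy_mev)

r1 = relative_resolution(1.0)  # 3% at 1 MeV, matching the quoted figure
r4 = relative_resolution(4.0)  # 1.5% at 4 MeV under this toy scaling
```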
He Li | Track 2: Data Analysis - Algorithms and Tools | Poster
A geometry management system (GMS) has been designed for the offline software of the Super Tau Charm Facility (STCF) in China. Based on the eXtensible Markup Language (XML) and the Detector Description Toolkit for High Energy Physics Experiments (DD4hep), the system provides a consistent detector-geometry description for different offline applications, such as simulation, reconstruction and...
Federico Fornari | Track 1: Computing Technology for Physics Research | Poster
Modern datacenters need distributed filesystems to provide user applications with access to data stored on a large number of nodes. The ability to mount a distributed filesystem and leverage its native application programming interfaces in a Docker container, combined with the advanced orchestration features provided by Kubernetes, can improve flexibility in installing, monitoring and...
Alan Malta Rodrigues (University of Nebraska Lincoln (US)), Daniele Spiga (Universita e INFN, Perugia (IT)), Tommaso Boccali (INFN Sezione di Pisa) | Track 1: Computing Technology for Physics Research | Poster
The CMS software stack (CMSSW) is built on a nightly basis for multiple hardware architectures and compilers, in order to benefit from diverse platforms. In practice, however, only x86_64 is used in production, and it is the only architecture supported by design by the workload management tools in charge of delivering production and analysis jobs to the distributed computing infrastructure. Profiting from an INFN...
Federico Fornari | Track 1: Computing Technology for Physics Research | Poster
In the present work, the possibility of exploiting EOS, an open-source storage software solution for multi-PB storage management at the CERN Large Hadron Collider, has been investigated in order to deploy a distributed filesystem over a storage backend provided by Ceph, an open-source software platform capable of exposing data through object, block and POSIX-compliant storage interfaces. The work...
Sascha Daniel Diefenbacher (Hamburg University (DE)) | Track 2: Data Analysis - Algorithms and Tools | Poster
One of the largest strains on computational resources in the field of high energy physics is Monte Carlo simulation. Given that this already high computational cost is expected to increase in the high-precision era of the LHC and at future colliders, fast surrogate simulators are urgently needed. Generative machine learning models offer a promising way to provide such a fast simulation by...
Alexander Rogachev (National Research University Higher School of Economics (RU), Yandex School of Data Analysis (RU)) | Track 2: Data Analysis - Algorithms and Tools | Poster
High energy physics experiments rely heavily on simulated data for physics analyses. However, running detailed simulation models requires a tremendous amount of computing resources, so new approaches to speed up detector simulation are needed. Generation of calorimeter responses is often the most expensive component of the simulation chain for HEP experiments. It has...
Adrian Alan Pol (CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
In this contribution, we apply deep learning object detection techniques based on convolutional blocks to the jet identification and reconstruction problem encountered at the CERN Large Hadron Collider. Particles reconstructed through the Particle Flow algorithm can be represented as an image composed of calorimeter and tracker cells, which serves as input to a Single Shot Detection network. The algorithm,...
Mr Andreas Pappas (National and Kapodistrian University of Athens (GR)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The LHCb detector is undergoing a comprehensive upgrade for data taking in the LHC’s Run 3, which is scheduled to begin in 2022. The new Run 3 detector has a different, upgraded geometry and uses new tools for its description, namely DD4hep and ROOT. In addition, visualization technologies have evolved considerably since Run 1, with the introduction of ubiquitous web-based solutions or...
Ludwig Albert Jaffe (Goethe University Frankfurt (DE)), Alexander Adler (Goethe University Frankfurt (DE)) | Track 1: Computing Technology for Physics Research | Poster
Containerisation is an elementary tool for sharing IT resources: it is more lightweight than full virtualisation, but offers comparable isolation. We argue that for many use cases which are typically approached with standard containerisation tools, less than full isolation is sufficient: sometimes, only networking, only storage, or both need to be different from their native, unisolated...
Xiaocong Ai (DESY) | Track 2: Data Analysis - Algorithms and Tools | Poster
Searching for anomalous objects from beyond-Standard-Model (BSM) signatures is an important mission of the LHC experiments. Recently, new particles at the sub-GeV scale have received increasing attention. Light pseudo-scalars such as axion-like particles (ALPs) and light scalars such as the dark Higgs are proposed by many BSM models and can serve as mediators of some sub-GeV dark matter...
Huw Haigh (Austrian Academy of Sciences (AT)) | Track 2: Data Analysis - Algorithms and Tools | Poster
In this talk, we present a novel implementation of a non-differentiable metric approximation with a corresponding loss scheduling, based on the minimization of a figure-of-merit-related function typical of particle physics (the so-called Punzi figure of merit). We call this new loss scheduling a "Punzi-loss function" and the neural network that minimizes it a "Punzi-net". We tested the...
Marco Rossi (CERN), Sofia Vallecorsa (CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
DUNE is a cutting-edge experiment aiming to study neutrinos in detail, with a special focus on the flavor oscillation mechanism. ProtoDUNE-SP (the prototype of the DUNE Far Detector single-phase TPC) has been built and operated at CERN, and a full suite of reconstruction tools has been developed. Pandora is a multi-algorithm framework that implements reconstruction tools: a large number...
Adam Abed Abud (University of Liverpool (GB) and CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
Deep Learning (DL) methods and Computer Vision are becoming important tools for event reconstruction in particle physics detectors. In this work, we report on the use of Submanifold Sparse Convolutional Neural Networks (SparseNet) for the classification of track and shower hits from a DUNE prototype liquid-argon detector at CERN (ProtoDUNE). By taking advantage of the three-dimensional nature...
Kaixuan Huang (Sun Yat-sen University) | Track 1: Computing Technology for Physics Research | Poster
In High Energy Physics (HEP) experiments, it is useful for physics analysis and outreach if the event display software can provide rich visualization effects. Unity is professional software that provides 3D modeling and animation production. GDML format files are commonly used for detector description in HEP experiments. In this work, we present a method for automating the import of GDML...
Stefano Piacentini (Università La Sapienza) | Track 2: Data Analysis - Algorithms and Tools | Poster
In this contribution we will show an innovative approach based on Bayesian networks and linear algebra that provides a solid and complete solution to the problem of the detector response and the related systematic effects. As a case study, we consider Dark Matter (DM) direct detection searches. In fact, over the past decades, a huge experimental effort has been devoted to ...