Saverio Mariani (Universita e INFN, Firenze (IT)) | Track 2: Data Analysis - Algorithms and Tools | Poster
An innovative approach to particle identification (PID) analyses employing machine learning techniques and its application to a physics case from the fixed-target programme at the LHCb experiment at CERN are presented. In general, a PID classifier is built by combining the response of specialized subdetectors, exploiting different techniques to guarantee redundancy and a wide kinematic...
Ke Li (University of Washington (US)) | Track 2: Data Analysis - Algorithms and Tools | Poster
ATLAS is one of the largest experiments at the Large Hadron Collider. Its broad physics program relies on very large samples of simulated events, but producing these samples is very CPU intensive when the full Geant4 detector simulation is used. A parameterization-based fast calorimeter simulation, AtlFast3, has been developed to replace the Geant4 simulation and meet the computing challenges....
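As a rough illustration of the parameterization idea (not the AtlFast3 code itself): calorimeter shower shapes are often described by fitted analytic profiles, e.g. a gamma distribution for the longitudinal energy deposit, which can then be sampled far more cheaply than a full Geant4 shower. The parameter values below are invented for the sketch.

```python
import random

# Hypothetical sketch: longitudinal shower profiles are commonly modeled as
#   dE/dt ~ t**(a-1) * exp(-t/b),
# with t the depth in radiation lengths and (a, b) parameters fitted to
# full simulation. Sampling this is orders of magnitude cheaper than Geant4.
def sample_shower_depths(n_hits, a=3.0, b=0.5, seed=42):
    """Draw energy-deposit depths from a gamma-profile parameterization."""
    rng = random.Random(seed)
    return [rng.gammavariate(a, b) for _ in range(n_hits)]

depths = sample_shower_depths(100_000)
mean_depth = sum(depths) / len(depths)  # analytic mean of the profile is a*b
```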
Oleg Kalashev (INR RAS) | Track 2: Data Analysis - Algorithms and Tools | Poster
The identification of the sources of ultra-high-energy cosmic rays is greatly complicated by the fact that even the highest-energy cosmic rays may be deflected by tens of degrees in the galactic magnetic fields. We show that the arrival directions of deflected cosmic rays from several of the nearest active galaxies form specific patterns in the sky, which can be effectively recognized by the...
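A back-of-envelope estimate (assumptions of this sketch, not the paper's model) shows why tens-of-degrees deflections are plausible: for a charge-Z nucleus of energy E traversing a coherent field B over a path L, the deflection is roughly theta = L / r_Larmor, with r_Larmor about 110 kpc for a 100 EeV proton in a 1 microgauss field.

```python
import math

# Hedged order-of-magnitude sketch of galactic magnetic deflection.
# r_Larmor ~ 110 kpc * (E / 100 EeV) / (Z * B / 1 uG); theta ~ L / r_Larmor.
def deflection_deg(E_EeV, Z=1, B_uG=1.0, L_kpc=1.0):
    r_larmor_kpc = 110.0 * (E_EeV / 100.0) / (Z * B_uG)
    return math.degrees(L_kpc / r_larmor_kpc)

theta_p = deflection_deg(100.0)        # 100 EeV proton: about half a degree/kpc
theta_fe = deflection_deg(50.0, Z=26)  # 50 EeV iron: tens of degrees
```

Heavier nuclei at lower energies pick up much larger deflections, which is why source identification needs pattern recognition rather than simple back-tracking.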
CMS Collaboration, Kevin Pedro (Fermi National Accelerator Lab. (US)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The high accuracy of detector simulation is crucial for modern particle physics experiments. However, this accuracy comes with a high computational cost, which will be exacerbated by the large datasets and complex detector upgrades associated with next-generation facilities such as the High Luminosity LHC. We explore the viability of regression-based machine learning (ML) approaches using...
Igor Pelevanyuk (Joint Institute for Nuclear Research (RU)) | Track 1: Computing Technology for Physics Research | Poster
The Joint Institute for Nuclear Research operates several large computing facilities: Tier-1 and Tier-2 grid clusters, the Govorun supercomputer, a cloud, and the LHEP computing cluster. Each of them has different access protocols, authentication and authorization procedures, and data access methods. With the help of the DIRAC Interware, we were able to integrate all these resources to provide uniform access to all...
Vincenzo Eduardo Padulano (Valencia Polytechnic University (ES)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The declarative approach to data analysis provides high-level abstractions for users to operate on their datasets in a much more ergonomic fashion compared to imperative interfaces. ROOT offers such a tool with RDataFrame, which creates a computation graph with the operations issued by the user and executes it lazily only when the final results are queried. It has always been oriented towards...
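The lazy computation-graph idea can be mimicked in a few lines of plain Python (a toy stand-in, not ROOT's RDataFrame; the `ToyFrame` class is invented for illustration): `Filter` and `Define` only record operations, and the single event loop runs when a terminal action like `Count` is requested.

```python
# Toy mimic of a declarative, lazily-evaluated analysis graph.
class ToyFrame:
    def __init__(self, rows, ops=None):
        self._rows = rows
        self._ops = ops or []

    def Define(self, name, func):
        # record a new derived column; nothing is computed yet
        return ToyFrame(self._rows, self._ops + [("define", name, func)])

    def Filter(self, pred):
        # record a selection; nothing is computed yet
        return ToyFrame(self._rows, self._ops + [("filter", pred)])

    def Count(self):
        # terminal action: only now does the single event loop run
        n = 0
        for row in self._rows:
            row = dict(row)
            keep = True
            for op in self._ops:
                if op[0] == "define":
                    row[op[1]] = op[2](row)
                elif op[0] == "filter" and not op[1](row):
                    keep = False
                    break
            n += keep
        return n

data = [{"pt": p} for p in (5, 15, 25, 35)]
n = (ToyFrame(data)
     .Define("pt2", lambda r: r["pt"] ** 2)
     .Filter(lambda r: r["pt2"] > 400)
     .Count())
```

Because the graph is assembled before execution, an engine built this way is free to fuse operations or parallelize the loop without the user changing their code.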
Dr Nikita Kazeev (Yandex School of Data Analysis (RU)) | Track 2: Data Analysis - Algorithms and Tools | Poster
In recent years, fully parametric fast-simulation methods based on generative models have been proposed for a variety of high-energy physics detectors. By their nature, the quality of data-driven models degrades in the regions of phase space where the data are sparse. Since machine-learning models are hard to analyze from physical principles, the commonly used testing procedures are...
Simon Metayer | Track 3: Computations in Theoretical Physics: Techniques and Methods | Poster
In this talk, we shall discuss recent results for the elastic degrees of freedom of fluctuating surfaces obtained by multi-loop approaches. These surfaces are ubiquitous in physics and are used to describe objects in various fields: from brane theory to membranes in biophysics and, more recently, graphene and graphene-like materials. We derive the three-loop order renormalization...
Dr John J. Oh (NIMS (South Korea)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The gravitational-wave detector is a very complicated and sensitive collection of advanced instruments, which is influenced not only by the mutual interaction between mechanical/electronics systems but also by the surrounding environment. Thus, it is necessary to categorize and reduce noises from many channels interconnected by such instruments and environment for achieving the detection...
Placido Fernandez Declara (CERN) | Track 1: Computing Technology for Physics Research | Poster
Detector optimisation and physics performance studies are an integral part of the development of future collider experiments. The Key4hep project aims to design a common set of software tools for future, or even present, High Energy Physics projects. Based on the iLCSoft and FCCSW frameworks, an integrated solution for detector simulation, reconstruction and analyses is being developed. This...
Gloria Corti (CERN), Michal Mazurek (CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
The LHCb Experiment at the Large Hadron Collider (LHC) at CERN has successfully performed a large number of physics measurements during Runs 1 and 2 of the LHC. It will resume operation in Run 3 with an upgraded detector to process events at up to five times higher luminosity. Monte Carlo simulations are key to the commissioning of the new detector and the interpretation of past and future...
Andrea Valenzuela Ramirez (Universitat Oberta de Catalunya (ES)) | Track 1: Computing Technology for Physics Research | Poster
The CernVM File System (CernVM-FS) is a global read-only POSIX file system that provides scalable and reliable software distribution to numerous scientific collaborations. It gives access to more than a billion binary files of experiment application software stacks and operating system containers to end user devices, grids, clouds, and supercomputers. CernVM-FS is asymmetric by construction....
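One idea underlying CernVM-FS's scalability is content-addressed storage: objects are keyed by the hash of their content, so identical files across software releases deduplicate automatically and clients can verify whatever they download. A minimal sketch of that idea (illustrative only, not the CernVM-FS implementation or its on-disk format):

```python
import hashlib

# Toy content-addressed object store: key = hash of the bytes.
store = {}

def publish(content: bytes) -> str:
    """Store content under its own hash; duplicates collapse to one object."""
    digest = hashlib.sha1(content).hexdigest()
    store[digest] = content
    return digest

def fetch(digest: str) -> bytes:
    """Retrieve and verify an object against its address."""
    data = store[digest]
    assert hashlib.sha1(data).hexdigest() == digest  # integrity check
    return data

h1 = publish(b"libfoo.so build 1")
h2 = publish(b"libfoo.so build 1")   # identical bytes: no new object stored
h3 = publish(b"libfoo.so build 2")
```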
Vasileios Belis (ETH Zurich (CH)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The advantage of quantum computers over classical devices lies in the possibility of using the quantum superposition of n qubits to perform exponentially many computations in parallel. This effect makes it possible to reduce the computational complexity of certain classes of problems, such as optimisation, sampling or combinatorial problems, in large-scale fault-tolerant quantum...
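The exponential scaling is easy to see in a toy state-vector picture (pure Python, no quantum library): a register of n qubits needs 2**n amplitudes, and applying a Hadamard to each qubit of |0...0> yields an equal superposition of all 2**n basis states.

```python
import math

def kron(u, v):
    """Kronecker product of two amplitude vectors."""
    return [a * b for a in u for b in v]

h0 = [1 / math.sqrt(2)] * 2   # Hadamard acting on |0>: (|0> + |1>) / sqrt(2)

n = 4
state = [1.0]                  # scalar start, grows to 2**n amplitudes
for _ in range(n):
    state = kron(state, h0)

norm = sum(a * a for a in state)   # total probability stays 1
```

Each added qubit doubles the classical description, which is exactly why classical simulation of large registers is hard while the quantum device holds the full superposition natively.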
Enrico Guiraud (EP-SFT, CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
In recent years, RDataFrame, ROOT's high-level interface for data analysis and processing, has seen widespread adoption on the part of HEP physicists. Much of this success is due to RDataFrame's ergonomic programming model that enables the implementation of common analysis tasks more easily than previous APIs, without compromising on application performance. Nonetheless, RDataFrame's...
Javier Lopez Gomez (CERN) | Track 2: Data Analysis - Algorithms and Tools | Poster
Upcoming HEP experiments, e.g. at the HL-LHC, are expected to increase the volume of generated data by at least one order of magnitude. In order to retain the ability to analyze the influx of data, full exploitation of modern storage hardware and systems, such as low-latency high-bandwidth NVMe devices and distributed object stores, becomes critical.
To this end, the ROOT RNTuple I/O...
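A toy comparison of the two layouts involved (illustrative only, not RNTuple's actual on-disk format): in a columnar store, an analysis that needs one field touches only that field's contiguous values, while a row-wise store must visit every event record.

```python
# 1000 toy events with three fields each.
events = [{"pt": float(i), "eta": 0.1 * i, "phi": 0.2 * i} for i in range(1000)]

# Row-wise layout: a list of whole event records.
row_store = events

# Columnar layout: one contiguous list per field.
col_store = {field: [e[field] for e in events] for field in events[0]}

# Reading only 'pt':
sum_pt_row = sum(e["pt"] for e in row_store)   # deserializes whole events
sum_pt_col = sum(col_store["pt"])              # touches a single column
```

On real storage the columnar version also compresses better, since values of one type and similar magnitude sit together, which is part of the motivation for formats like RNTuple.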
Aziz Temirkhanov (National Research University Higher School of Economics (RU)) | Track 2: Data Analysis - Algorithms and Tools | Poster
The volume of data processed by the Large Hadron Collider experiments demands sophisticated selection rules typically based on machine learning algorithms. One of the shortcomings of these approaches is their profound sensitivity to the biases in training samples. In the case of particle identification (PID), this might lead to degradation of the efficiency for some decays on validation due to...
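One common mitigation for such training-sample bias (a generic sketch, not necessarily the authors' method) is per-bin reweighting: training events are weighted so that the spectrum of a control variable, e.g. momentum, matches the target sample's.

```python
def bin_index(v, edges):
    """Index of the histogram bin containing v, or None if out of range."""
    for i in range(len(edges) - 1):
        if edges[i] <= v < edges[i + 1]:
            return i
    return None

def reweight(train, target, edges):
    """Per-event weights making the training spectrum match the target's."""
    nb = len(edges) - 1
    h_tr, h_tg = [0] * nb, [0] * nb
    for v in train:
        h_tr[bin_index(v, edges)] += 1
    for v in target:
        h_tg[bin_index(v, edges)] += 1
    # per-bin weight = target fraction / training fraction
    w_bin = [(h_tg[i] / len(target)) / (h_tr[i] / len(train)) if h_tr[i] else 0.0
             for i in range(nb)]
    return [w_bin[bin_index(v, edges)] for v in train]

train = [0.1, 0.2, 0.3, 0.6]     # training sample skewed to low values
target = [0.1, 0.6, 0.7, 0.8]    # target spectrum skewed high
w = reweight(train, target, edges=[0.0, 0.5, 1.0])
```

After weighting, the training sample's weighted histogram reproduces the target histogram, so a classifier trained with these weights no longer learns the spurious spectral difference.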
Michel Hernandez Villanueva (DESY) | Track 1: Computing Technology for Physics Research | Poster
With the upgrades of current high-energy physics (HEP) experiments and the new facilities coming online, solving software challenges has become integral to the success of the collaborations, and the demand for personnel highly skilled in both HEP and software domains is increasing. With such a highly distributed workforce, the sustainability of the HEP ecosystem...
Niclas Steve Eich (Rheinisch Westfaelische Tech. Hoch. (DE)) | Track 2: Data Analysis - Algorithms and Tools | Poster
We present a specialised layer for generative modeling of LHC events with generative adversarial networks. We use Lorentz boosts, rotations, momentum and energy conservation to build a network cell generating a 2-body particle decay. This cell is stacked consecutively in order to model two staged decays, respecting the symmetries across the decay chain. We allow for modifications of the...
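The kinematic constraints such a decay cell must respect are the standard two-body formulas (shown here as plain physics, not the network code): in the parent rest frame both daughters carry the same momentum magnitude, fixed by the three masses, and their energies sum to the parent mass.

```python
import math

def two_body_decay(M, m1, m2):
    """Rest-frame momentum and daughter energies for M -> m1 + m2."""
    p = math.sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)
    e1 = math.sqrt(m1**2 + p**2)
    e2 = math.sqrt(m2**2 + p**2)
    return p, e1, e2

# Example: a 125 GeV parent decaying to two 4.18 GeV daughters
# (masses chosen for illustration).
p, e1, e2 = two_body_decay(125.0, 4.18, 4.18)
```

Building these relations into the network cell, rather than hoping a generic generator learns them, is what guarantees exact energy-momentum conservation across the generated decay chain.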
Jonas Eschle (Universitaet Zuerich (CH)) | Track 2: Data Analysis - Algorithms and Tools | Poster
Statistical modelling and likelihood inference are key elements in many sciences, especially in High-Energy Physics (HEP) analyses. These require advanced features such as handling large amounts of data; supporting binned, unbinned and mixed inference; using complicated and often custom-made model functions; and being highly performant. In HEP, these features were covered in C++ frameworks...
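The core operation such frameworks automate can be shown in miniature (pure Python, a toy stand-in for what production likelihood tools do at scale): build the unbinned negative log-likelihood of a sample under a Gaussian model and minimize it over the mean, here by a simple grid scan.

```python
import math
import random

# Toy unbinned maximum-likelihood fit of a Gaussian mean.
rng = random.Random(0)
data = [rng.gauss(1.5, 1.0) for _ in range(2000)]   # pseudo-data, true mu = 1.5

def nll(mu, sigma=1.0):
    """Unbinned Gaussian negative log-likelihood of the sample."""
    return sum(0.5 * ((x - mu) / sigma) ** 2
               + math.log(sigma * math.sqrt(2 * math.pi)) for x in data)

# Grid scan over mu; for a Gaussian the MLE equals the sample mean,
# so the scan minimum lands on the grid point nearest that mean.
mus = [i * 0.01 for i in range(100, 200)]
mu_hat = min(mus, key=nll)
sample_mean = sum(data) / len(data)
```

Real frameworks replace the grid scan with gradient-based minimizers, add error estimation, and vectorize the likelihood evaluation, but the objective function is the same.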