Conveners
Plenary Session Tuesday
- Graeme A Stewart (CERN)
- Oksana Shadura (University of Nebraska Lincoln (US))
Plenary Session Tuesday
- Graeme A Stewart (CERN)
- Eduardo Rodrigues (University of Liverpool (GB))
Creating, manipulating, editing and validating detector or accelerator geometry for Monte Carlo codes such as Geant4, FLUKA, MCNP and PHITS is a time-consuming and error-prone process. Diverse tools for achieving typical workflows are available, but rarely under a single coherent package. Pyg4ometry is a Python-based code for manipulating geometry, mainly for Geant4 but also FLUKA and soon MCNP...
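A minimal sketch of the kind of workflow the abstract describes, assuming the pyg4ometry GDML reader and FLUKA conversion interfaces as documented (`pyg4ometry.gdml.Reader`, `convert.geant4Reg2Fluka`, `fluka.Writer`); the file names are placeholders and the exact API may differ between versions.

```python
# Hypothetical sketch: load a GDML geometry, inspect it, and convert it to FLUKA.
# The pyg4ometry calls below are taken from the package documentation as recalled
# and should be checked against the installed version.
import pyg4ometry

# Read an existing Geant4 GDML file into a geometry registry
reader = pyg4ometry.gdml.Reader("detector.gdml")
reg = reader.getRegistry()

# Quick sanity check: list the logical volumes defined in the file
print(list(reg.logicalVolumeDict.keys()))

# Convert the Geant4 registry to a FLUKA registry and write a FLUKA input file
fluka_reg = pyg4ometry.convert.geant4Reg2Fluka(reg)
writer = pyg4ometry.fluka.Writer()
writer.addDetector(fluka_reg)
writer.write("detector.inp")
```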
During this talk I will present our experiences executing analysis workflows on thousands of cores. We use TaskVine, a general-purpose task scheduler for large-scale, data-intensive, dynamic Python applications, to execute the task graph generated by Coffea+Dask. As task data becomes available, TaskVine adapts the cores and memory allocated to maximize throughput and minimize retries...
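A hedged sketch of how a Dask task graph can be handed to TaskVine; the class and argument names (`ndcctools.taskvine.DaskVine`, `scheduler=m.get`) follow the TaskVine documentation as recalled and are assumptions, and a trivial Dask array stands in for the graph Coffea would produce.

```python
# Hedged sketch: executing a Dask task graph with TaskVine's Dask scheduler.
import dask.array as da
from ndcctools.taskvine import DaskVine

# Start a TaskVine manager; workers on the cluster connect to this port
m = DaskVine(port=9123, name="coffea-analysis")

# Any Dask collection works; in the real workflow this graph comes from Coffea+Dask
x = da.random.random((100_000, 100), chunks=(10_000, 100))
result = x.sum().compute(scheduler=m.get)  # TaskVine schedules the graph's tasks
print(result)
```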
A configuration layer for the analysis of CMS data in the NanoAOD format is presented. The framework is based on the columnar analysis of proton-proton collision events with the Coffea Python package and focuses on the configurability and reproducibility of analysis tasks.
All the operations needed to extract the relevant information from events are performed by a Coffea processor object...
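For illustration, a minimal Coffea processor of the kind mentioned above; the muon selection and histogram axes are illustrative choices, not taken from the abstract.

```python
# Minimal sketch of a Coffea processor operating on NanoAOD-style events.
import hist
import awkward as ak
from coffea import processor

class DimuonProcessor(processor.ProcessorABC):
    def process(self, events):
        # Columnar selection: keep events with at least two muons
        muons = events.Muon
        cut = ak.num(muons) >= 2
        pt = ak.flatten(muons[cut].pt)

        # Fill a histogram of muon transverse momentum
        h = hist.Hist.new.Reg(50, 0, 200, name="pt", label="muon pT [GeV]").Double()
        h.fill(pt=ak.to_numpy(pt))
        return {"muon_pt": h}

    def postprocess(self, accumulator):
        return accumulator
```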
With the increasing dataset sizes brought by current LHC data analysis workflows, and future expectations of even greater computational needs, data analysis software must strive to optimise processing throughput on a single core and to ensure an efficient distribution of tasks across multiple cores and computing nodes. RDataFrame, the high-level interface for data analysis...
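A short sketch of a multi-threaded RDataFrame analysis through the Python bindings; the tree and branch names ("Events", "nMuon", "Muon_pt") are placeholders.

```python
# Sketch of a multi-threaded RDataFrame analysis in Python.
import ROOT

ROOT.EnableImplicitMT()  # distribute the event loop over the available cores

df = ROOT.RDataFrame("Events", "data.root")
h = (df.Filter("nMuon >= 2", "at least two muons")
       .Define("leading_pt", "Muon_pt[0]")
       .Histo1D(("leading_pt", "Leading muon pT;pT [GeV];Events", 50, 0.0, 200.0),
                "leading_pt"))

h.Draw()             # the lazy event loop is triggered here
df.Report().Print()  # cut-flow summary of the Filter calls
```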
Reinforcement Learning (RL) is becoming one of the most effective paradigms of Machine Learning for training autonomous systems.
In the context of astronomical observation campaigns, this paradigm can be used to train autonomous telescopes that optimize sequential schedules based on a given scientific reward, avoiding manual optimization, which may result in...
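Purely as an illustration of reward-driven scheduling (not the authors' setup), a tabular Q-learning toy in which an agent chooses which of N targets to observe next and receives a hand-made "scientific reward" that decays once a target is repeated.

```python
# Illustrative only: Q-learning on a toy observation-scheduling problem.
import numpy as np

rng = np.random.default_rng(0)
n_targets, n_steps, episodes = 5, 10, 500
base_reward = rng.uniform(0.5, 1.0, n_targets)   # hypothetical per-target value

Q = np.zeros((n_targets + 1, n_targets))         # state = last observed target
alpha, gamma, eps = 0.1, 0.9, 0.1

for _ in range(episodes):
    state, seen = n_targets, set()               # start in a dummy "no target" state
    for _ in range(n_steps):
        a = rng.integers(n_targets) if rng.random() < eps else int(Q[state].argmax())
        r = base_reward[a] * (0.2 if a in seen else 1.0)   # repeats are less valuable
        seen.add(a)
        Q[state, a] += alpha * (r + gamma * Q[a].max() - Q[state, a])
        state = a

print("first target of the learned greedy schedule:", int(Q[n_targets].argmax()))
```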
We present pymcabc, a High Energy Physics toy toolkit for the ABC model. The ABC model is a pedagogical model consisting of three scalar particles of arbitrary masses; the only interaction among these particles occurs when all three are present together. The pymcabc software can calculate all the leading-order cross-sections as well as decay widths within the...
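To make the physics concrete without guessing at the pymcabc API, here is a standalone numerical sketch of the leading-order A A → B B cross section in the ABC toy model (scalar C exchange in the t and u channels); the coupling, masses and energy are arbitrary placeholders.

```python
# Not the pymcabc API: tree-level AA -> BB cross section in the ABC toy model.
import numpy as np

g, mA, mB, mC = 1.0, 1.0, 0.5, 2.0    # hypothetical coupling and masses
sqrt_s = 5.0                          # centre-of-mass energy
s = sqrt_s**2

p_i = np.sqrt(s / 4 - mA**2)          # incoming momentum in the CM frame
p_f = np.sqrt(s / 4 - mB**2)          # outgoing momentum
E1 = E3 = sqrt_s / 2

cos_theta = np.linspace(-1, 1, 2001)
t = mA**2 + mB**2 - 2 * (E1 * E3 - p_i * p_f * cos_theta)
u = mA**2 + mB**2 - 2 * (E1 * E3 + p_i * p_f * cos_theta)

amp = g**2 * (1 / (t - mC**2) + 1 / (u - mC**2))   # tree-level amplitude
# factor 0.5 for identical final-state particles, 2*pi from the phi integral
dsigma_dcos = 0.5 * 2 * np.pi * np.abs(amp)**2 / (64 * np.pi**2 * s) * p_f / p_i

sigma = np.trapz(dsigma_dcos, cos_theta)
print(f"sigma(AA -> BB) ~ {sigma:.3e} (natural units)")
```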
RooFit is a C++ library for statistical modelling and analysis. It is part of the ROOT framework and also provides Python bindings. RooFit provides some basic building blocks to construct probability density functions for data modelling. However, in some application areas, the analytical physics-driven shapes for data modelling have become so complicated that they can't be covered by a...
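A minimal example of those basic building blocks through the RooFit Python bindings: a Gaussian pdf is assembled from variables, toy data is generated, and the parameters are fitted back. Variable names and ranges are illustrative.

```python
# Minimal RooFit usage via the ROOT Python bindings.
import ROOT

x = ROOT.RooRealVar("x", "observable", -10, 10)
mean = ROOT.RooRealVar("mean", "mean", 0, -5, 5)
sigma = ROOT.RooRealVar("sigma", "width", 1, 0.1, 5)
gauss = ROOT.RooGaussian("gauss", "Gaussian pdf", x, mean, sigma)

data = gauss.generate(ROOT.RooArgSet(x), 10000)  # toy dataset
gauss.fitTo(data)                                # maximum-likelihood fit

frame = x.frame()
data.plotOn(frame)
gauss.plotOn(frame)
frame.Draw()
```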
This research introduces an automated system capable of efficiently simulating, translating, and analyzing high-energy physics (HEP) data. By leveraging HEP data simulation software, computer clusters, and cutting-edge machine learning algorithms, such as convolutional neural networks (CNNs) and autoencoders, the system effectively manages a dataset of approximately 10,000 entries.
Using the...
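As an illustration of the autoencoder component mentioned above (the actual architecture is not specified in the abstract), a compact convolutional autoencoder for 32x32 image-like detector data; the layer sizes and the random stand-in dataset are placeholders.

```python
# Illustrative only: a small convolutional autoencoder in Keras.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(32, 32, 1))
x = layers.Conv2D(16, 3, activation="relu", padding="same")(inputs)
x = layers.MaxPooling2D(2)(x)                       # 16x16 feature maps
x = layers.Conv2D(8, 3, activation="relu", padding="same")(x)
encoded = layers.MaxPooling2D(2)(x)                 # 8x8 bottleneck

x = layers.Conv2DTranspose(8, 3, strides=2, activation="relu", padding="same")(encoded)
x = layers.Conv2DTranspose(16, 3, strides=2, activation="relu", padding="same")(x)
outputs = layers.Conv2D(1, 3, activation="sigmoid", padding="same")(x)

autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

# Stand-in for the ~10,000-entry simulated dataset
data = np.random.rand(10000, 32, 32, 1).astype("float32")
autoencoder.fit(data, data, epochs=1, batch_size=128, validation_split=0.1)
```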