Danilo Piparo (CERN) 14/05/2024, 09:00 Talk
In this contribution, we'll review the current status of the ROOT project, characterising its structure, available effort and strategic goals. We'll explain how in recent years the energy flowing from the open source community has changed ROOT and boosted development, materialising in the form of code, reports, ideas and proposals. We'll review the recently integrated features that are key...
Eduardo Rodrigues (University of Liverpool (GB)) 14/05/2024, 09:25
Scikit-HEP is a community-driven and community-oriented project with the goal of providing an ecosystem for particle physics data analysis in Python, fully integrated with the wider scientific Python ecosystem. The project started in Autumn 2016 and has evolved into a toolset of approximately thirty packages and a few "affiliated" packages.
It expands the typical Python data analysis tools... -
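As a flavour of the kind of particle-physics-specific functionality the Scikit-HEP ecosystem layers on top of the scientific Python stack (packages such as `vector` provide this, vectorised and far more completely), here is a minimal standard-library sketch of one common operation, the invariant mass of two candidates; the particle values are made up for illustration:

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of two (approximately massless) particles,
    each given as (pt, eta, phi)."""
    pt1, eta1, phi1 = p1
    pt2, eta2, phi2 = p2
    # m^2 = 2 * pt1 * pt2 * (cosh(eta1 - eta2) - cos(phi1 - phi2))
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

# Two hypothetical muon candidates (pt in GeV)
m = invariant_mass((45.0, 0.2, 0.5), (47.0, -0.3, 2.8))
```

In the ecosystem proper, the same computation runs over whole arrays of candidates at once rather than one pair at a time.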
Thomas Madlener (Deutsches Elektronen-Synchrotron (DESY)) 14/05/2024, 09:50 Talk
Providing and maintaining the necessary tools for studying and developing detectors for future colliders is non-trivial. On the one hand, it requires a substantially sized software stack with all the complications arising therefrom. On the other hand, the available person power is usually strongly limited. In order to tackle both, the Key4hep project aims at providing a complete software stack that...
David Lange (Princeton University (US)) 14/05/2024, 11:15 Talk
The HSF-India initiative aims to implement new and impactful research software collaborations between India, Europe and the United States. The intent of this project is to increase the engagement of software experts in Asia with the HSF community. The starting point of this collaboration is a series of software workshops focused on building software skills. These workshops are the basis...
Tobias Fitschen (University of Manchester (GB)) 14/05/2024, 11:35 Talk
A number of analyses and performance groups in ATLAS use an analysis framework, written in C++ with Python steering files, called xAODAnaHelpers (xAH). xAH is used to loop over events in a variety of ATLAS analysis data formats, using central software to calibrate, select and correct physics objects. xAH has been chosen as one of the EVERSE (European Virtual Institute for Research Software...
Leif Lönnblad (Lund University (SE)) 14/05/2024, 11:55 Talk
I will describe the current status of the Pythia8 project and some future developments that we are working on. I will also describe services offered by the Pythia8 collaboration, such as online tutorials and our GitLab help desk.
Edward Moyse (University of Massachusetts (US)) 14/05/2024, 12:20 Talk
Phoenix is a TypeScript-based event display framework, created in response to the 2017 HSF community white paper.
It uses industry standard web tools (such as the popular three.js library for 3D rendering), and runs entirely in the client's web browser. It is experiment agnostic by design, providing shared common functionality (such as custom menus, controls, propagators) but also has...
Andrii Verbytskyi (Max Planck Society (DE)) 14/05/2024, 16:00 Talk
HepMC3 is a library developed to handle the simulated collision events from Monte Carlo event generators in High Energy Physics. The library is a successor in spirit of the earlier HepMC library and incorporates multiple ideas that have appeared over the past decade in the HEP community.
This contribution discusses in detail the recent developments of the HepMC3 project, the relation of HepMC3...
Dr Luke Pickering (Royal Holloway, University of London) 14/05/2024, 16:20 Talk
Simulations of neutrino interactions are playing an increasingly important role in the pursuit of high-priority measurements for the field of particle physics. A significant technical barrier for efficient development of these simulations is the lack of a standard data format for representing individual neutrino scattering events. We propose and define such a universal format, named NuHepMC,...
Lino Oscar Gerlach (Brookhaven National Laboratory (US)) 14/05/2024, 16:40 Talk
Conditions data is the subset of non-event data that is necessary to process event data. It poses a unique set of challenges, namely a heterogeneous structure and high access rates by distributed computing. As these challenges are similar across various High Energy Physics (HEP) and Nuclear Physics (NP) experiments, the HEP Software Foundation (HSF) hosted a forum to discuss and share...
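One pattern common to conditions databases across experiments is interval-of-validity (IOV) lookup: each payload is valid from a given run until the next payload's start. The sketch below is a toy standard-library illustration of that pattern (the class and payload contents are hypothetical, not any experiment's actual API):

```python
import bisect

class ConditionsStore:
    """Toy interval-of-validity (IOV) lookup: each payload is valid
    from its start run until the next payload's start run."""
    def __init__(self):
        self._starts = []    # sorted IOV start runs
        self._payloads = []  # payload valid from the matching start run

    def add(self, start_run, payload):
        i = bisect.bisect_left(self._starts, start_run)
        self._starts.insert(i, start_run)
        self._payloads.insert(i, payload)

    def get(self, run):
        # Find the latest IOV whose start run is <= the requested run.
        i = bisect.bisect_right(self._starts, run) - 1
        if i < 0:
            raise KeyError(f"no conditions for run {run}")
        return self._payloads[i]

store = ConditionsStore()
store.add(1, {"alignment": "v1"})
store.add(100, {"alignment": "v2"})
```

Real conditions services add caching and distributed access on top, which is where the high-access-rate challenge mentioned above comes in.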
Adam Morris (CERN) 14/05/2024, 17:00 Talk
Gaussino is an experiment-independent simulation package built upon the Gaudi software framework. It provides generic core components and interfaces for a complete HEP simulation application: event generation, detector simulation, geometry, monitoring and output of the simulated data. The generator interface allows for a wide variety of external event generator packages to be used, with an...
Danilo Piparo (CERN) 15/05/2024, 11:15 Talk
Intense collaborative work is ongoing on the development and testing of RNTuple, the future HEP columnar storage software technology, involving the LHC experiments, DUNE and the ROOT team.
In this contribution we'll review the status of the RNTuple plan of work, towards the freezing of the specification at the end of the year. We'll review the new features of RNTuple, as well as the...
James Smith (University of Manchester (GB)) 15/05/2024, 11:35 Talk
The storage, transmission and processing of data is a major challenge across many fields of physics and industry. Traditional generic data compression techniques are lossless, but are limited in performance and require additional computation.
BALER [1,2] is an open-source autoencoder-based framework for the development of tailored lossy data compression models suitable for data from...
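BALER itself trains autoencoders to learn a tailored compression; as a much simpler stand-in, the sketch below uses plain uniform quantization (not BALER's method, and all names are made up) just to illustrate the lossy compress/decompress round trip and the reconstruction error that such frameworks trade off against compression ratio:

```python
def quantize(values, bits=8):
    """Uniform lossy quantization: map floats to small integer codes."""
    lo, hi = min(values), max(values)
    levels = (1 << bits) - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = [round((v - lo) / scale) for v in values]
    return codes, lo, scale

def dequantize(codes, lo, scale):
    """Reconstruct approximate floats from the integer codes."""
    return [lo + c * scale for c in codes]

data = [0.0, 0.1, 0.5, 0.77, 1.0]
codes, lo, scale = quantize(data)
restored = dequantize(codes, lo, scale)
# Worst-case error of uniform quantization is half a quantization step.
max_err = max(abs(a - b) for a, b in zip(data, restored))
```

A learned autoencoder replaces the fixed quantization grid with an encoder/decoder pair fitted to the structure of the data, which is what lets the compression be "tailored".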
David Martin Koch (Ludwig Maximilians Universitat (DE)) 15/05/2024, 11:55 Talk
A fast turn-around time and ease of use are important factors for systems supporting the analysis of large HEP data samples. We study and compare multiple technical approaches.
This presentation will be about setting up and benchmarking the Analysis Grand Challenge (AGC) [1] using CMS Open Data. The AGC is an effort to provide a realistic physics analysis with the intent of showcasing the... -
Benjamin Galewsky (Univ. Illinois at Urbana Champaign (US)) 15/05/2024, 12:15 Talk
Cloud data lake technologies have been used successfully in industry for the analysis of exabyte-scale datasets. The technologies that underlie this architecture are:
- Object Store
- Parquet file format
- Kubernetes
- Distributed SQL
We will describe our work using a Trino distributed SQL engine to join selected event data with inference results. We will show how this architecture can...
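Trino speaks standard SQL, so the join of event data with inference results described above can be sketched with the standard library's `sqlite3` (the table and column names here are illustrative assumptions, not the project's actual schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (event_id INTEGER, met REAL)")
con.execute("CREATE TABLE inference (event_id INTEGER, score REAL)")
con.executemany("INSERT INTO events VALUES (?, ?)",
                [(1, 120.5), (2, 45.2), (3, 310.0)])
con.executemany("INSERT INTO inference VALUES (?, ?)",
                [(1, 0.92), (2, 0.11), (3, 0.88)])

# Join selected event data with per-event inference results, keeping
# only high-score candidates; the same shape of query would run on a
# Trino cluster over Parquet files in an object store.
rows = con.execute("""
    SELECT e.event_id, e.met, i.score
    FROM events e
    JOIN inference i ON e.event_id = i.event_id
    WHERE i.score > 0.5
    ORDER BY e.event_id
""").fetchall()
```

The point of the architecture is that the query stays the same while the engine scales out over object storage and Kubernetes.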
Alexander Moreno Briceño (Universidad Antonio Nariño) 15/05/2024, 16:00 Talk
-
Francesca Calegari 15/05/2024, 16:10 Talk
-
Jamie Gooding 15/05/2024, 16:20 Talk
-
Angela Warkentin 15/05/2024, 16:30 Talk
-
Valeriia Lukashenko 15/05/2024, 16:40 Talk
-
Stefan Roiser (CERN) 15/05/2024, 16:50
-
15/05/2024, 17:00
-
Jonathan Butterworth (UCL) 16/05/2024, 09:00 Talk
Contur (Constraints On New Theories Using Rivet) is a public python package sitting on top of Rivet and Yoda, which allows information on new BSM models to be extracted from particle-level differential cross section measurements from the LHC. BSM events simulated by a general-purpose MC event generator are "signal injected" into the fiducial phase space of hundreds of measurements...
Lorenz Gärtner (LMU) 16/05/2024, 09:20 Talk
Experimental High Energy Physics has entered an era of precision measurements. However, measurements of many of the accessible processes assume that the final states' underlying kinematic distribution is the same as the Standard Model prediction. This assumption introduces an implicit model-dependency into the measurement, rendering the reinterpretation of the experimental analysis complicated...
Jonas Eschle (Syracuse University (US)) 16/05/2024, 09:40 Talk
The Python HEP analysis ecosystem and its user base grew significantly in the last few years, and with it the need for advanced statistical inference tools involving likelihood fits, a core part of most analyses in HEP.
zfit started over five years ago with the goal to provide this capability, a library for model fitting in HEP: scalable - in terms of model building complexity and performance... -
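The objective at the heart of such model fitting is the unbinned negative log-likelihood. As a toy illustration only (zfit builds models and uses real minimisers rather than the crude grid scan shown here), a Gaussian mean can be estimated by minimising that objective directly with the standard library:

```python
import math
import random

random.seed(0)
# Toy dataset: 2000 draws from a Gaussian with mean 1.25, width 0.5
data = [random.gauss(1.25, 0.5) for _ in range(2000)]

def nll(mu, sigma, sample):
    """Unbinned negative log-likelihood of a Gaussian model."""
    norm = math.log(sigma * math.sqrt(2.0 * math.pi))
    return sum(0.5 * ((x - mu) / sigma) ** 2 + norm for x in sample)

# Crude scan over the mean with the width held fixed; fitting
# libraries minimise the same objective with proper optimisers
# and propagate uncertainties.
mus = [i / 100 for i in range(0, 301)]
best_mu = min(mus, key=lambda mu: nll(mu, 0.5, data))
```

Scaling this up, in model complexity, dataset size and hardware, is exactly the problem space the abstract describes.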
Patrick Stowell (University of Sheffield) 16/05/2024, 10:00 Talk
NUISANCE is a neutrino event generator prediction comparison and tuning framework. It facilitates cross-section predictions for the five main event generators in use by the few-GeV neutrino scattering community, enabling non-expert users to compare predictions to over 350 neutrino cross-section measurements, from the historical to the cutting edge.
We are currently in the process of...
Christian Gutschow (UCL (UK)) 16/05/2024, 11:15 Talk
In the age of GPU-accelerated event generation, pivotal community tools like HepMC and Rivet, vital for event generation infrastructure and Monte Carlo event analysis, risk becoming significant bottlenecks in the near future.
We present an adaptable and highly efficient approach to simulating collider events featuring multi-jet final states, encompassing both leading and next-to-leading...
Max Knobbe (University of Göttingen) 16/05/2024, 11:35 Talk
High-precision calculations are crucial for the success of the LHC physics programme. However, the rising computational complexity for high-multiplicity final states is threatening to become a limiting bottleneck in the coming years. At the same time, the rapid deployment of non-traditional GPU-based computing hardware in data centres around the world demands an overhaul of the event generator...
Arthur Hennequin (CERN) 16/05/2024, 11:55 Talk
Since 2022, the LHCb detector has been taking data with a full software trigger at the LHC proton-proton collision rate, implemented on GPUs in the first stage and on CPUs in the second stage. This setup makes it possible to perform alignment and calibration online and to run physics analyses directly on the output of the online reconstruction, following the real-time analysis paradigm.
This talk will... -
Attila Krasznahorkay (CERN) 16/05/2024, 12:15 Talk
Reconstructing the tracks left by charged particles in modern HEP detectors is one of the most computationally challenging tasks in analyzing the data of modern experiments. During the High-Luminosity LHC era the LHC experiments, including ATLAS, will have to be able to process much more complex data at much higher rates than ever before.
To achieve this, GPU accelerated code has been...