The LHCb Collaboration has used ROOT in its software framework from the beginning, with varying degrees of efficiency. Here we summarize the evolution of our use of ROOT and then look ahead to possible paths in view of LHC Run 4 and LHCb Upgrade 2.
For a few years, INFN has been developing R&D projects for CMS high-rate analyses in view of the challenges of the HL-LHC phase, starting in 2030. Several studies have since been carried out, and as a result a prototype analysis facility has been proposed, compliant with CMS analysis frameworks (e.g. based on Coffea and ROOT RDataFrame) and also adopting open-source...
Can we teach an LLM to plot experimental HEP data? Modern particle physics workflows increasingly depend on a complex software ecosystem that connects large datasets, distributed data delivery, and user-level analysis tools. We demonstrate how a Large Language Model (LLM) can act as a coding assistant that bridges these components. Starting from a high-level user request, such as "plot jet...
The NDMSPC project introduces an innovative THnSparse analysis framework for high-dimensional data, directly addressing the challenge of fitting N-dimensional histograms within memory. Our solution organizes the THnSparse structure into sub-chunks, optimizing memory efficiency and enabling scalable processing. Beyond numerical values, the framework allows bin content to be arbitrary objects,...
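The sub-chunking idea above can be illustrated with a minimal sketch (this is not the NDMSPC or THnSparse API; the class and names are invented for illustration): a sparse N-dimensional histogram keyed by bin-index tuples, partitioned into chunks so that each chunk can be loaded and fitted independently of the rest.

```python
from collections import defaultdict

class SparseHist:
    """Toy sparse N-dim histogram split into sub-chunks (illustrative only)."""

    def __init__(self, ndim, chunk_size):
        self.ndim = ndim
        self.chunk_size = chunk_size          # bins per axis in one chunk
        self.chunks = defaultdict(dict)       # chunk key -> {bin tuple: content}

    def _chunk_key(self, bins):
        # Integer-divide each axis index to find which chunk owns this bin.
        return tuple(b // self.chunk_size for b in bins)

    def fill(self, bins, weight=1.0):
        chunk = self.chunks[self._chunk_key(bins)]
        chunk[bins] = chunk.get(bins, 0.0) + weight

    def content(self, bins):
        return self.chunks[self._chunk_key(bins)].get(bins, 0.0)

h = SparseHist(ndim=3, chunk_size=4)
h.fill((1, 2, 3))
h.fill((1, 2, 3), 0.5)
h.fill((9, 9, 9))
# Only the two touched chunks exist in memory; each could be fitted on its own.
# In the real framework, bin content may also be an arbitrary object
# rather than a float, as the abstract describes.
```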
LHCb's data analysis model is evolving rapidly in preparation for Run 5 and the HL-LHC era. This talk will outline how LHCb analysts perform end-to-end analyses, from data production to final physics results, and how these workflows differ from those of the general-purpose detectors. We will discuss the increasing demands on scalability, interoperability, and usability of analysis tools,...
CMS is developing FlashSim, a machine learning-based framework that produces analysis-level (NANOAOD) events directly from generator-level inputs, reducing simulation costs by orders of magnitude. Efficient integration of preprocessing, inference, and output is essential, and ROOT RDataFrame provides the backbone of this workflow.
Certain operations required for FlashSim are not yet part of...
Analyzing HL-LHC heavy-ion collision data with ALICE
Victor Gonzalez, Wayne State University (US),
on behalf of the ALICE Collaboration
The ALICE detector has been taking data in the heavy-ion HL-LHC regime since the start of the LHC Run 3 Pb-Pb campaign in October 2023. Recording Pb-Pb collisions at 50 kHz and pp collisions at up to 1 MHz interaction rates without a trigger results in...
We present a new, modular framework for processing test beam data, in particular for the R&D programme of future timing detectors.
Built on a C++ architecture, it aims to standardise workflows for the analysis of detector performance through the definition of standard and user-defined analyses (e.g. time discrimination algorithms, intrinsic time resolution, or inter-channel...
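The pattern of standard and user-defined analyses can be sketched as a simple plugin interface. This is a minimal Python illustration only (the actual framework is C++, and the class and field names here are invented): each analysis sees every event and reports a result at the end.

```python
class Analysis:
    """Base class every analysis plugin implements (illustrative names)."""

    def process(self, event):
        raise NotImplementedError

    def result(self):
        raise NotImplementedError

class MeanTimeDifference(Analysis):
    """Example user-defined analysis: mean time difference between two channels."""

    def __init__(self):
        self.diffs = []

    def process(self, event):
        self.diffs.append(event["t_ch0"] - event["t_ch1"])

    def result(self):
        return sum(self.diffs) / len(self.diffs)

def run(events, analyses):
    # The framework loop: every registered analysis sees every event.
    for ev in events:
        for a in analyses:
            a.process(ev)
    return [a.result() for a in analyses]

events = [{"t_ch0": 10.2, "t_ch1": 10.0}, {"t_ch0": 11.1, "t_ch1": 10.8}]
(res,) = run(events, [MeanTimeDifference()])
```

Registering a new analysis then only requires subclassing the base class, with no change to the event loop.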
This contribution highlights practical uses of ROOT and JSROOT for data exploration and web-based visualization. We show performance improvements achieved by replacing matplotlib with JSROOT in the CalibView app for the CMS PPS project. A key feature is the partial reading of ROOT files and handling plot data as JSON objects.
We also present a lightweight approach using static websites...
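Handling plot data as JSON objects, as described above, can be sketched as follows. This is an illustrative schema only, not JSROOT's actual serialization (which comes from ROOT's TBufferJSON); the function and field names are invented.

```python
import json

def hist_to_json(name, edges, counts):
    """Package 1-D histogram data as a JSON string for a web frontend.

    Illustrative schema; JSROOT's real format is produced by TBufferJSON.
    """
    assert len(edges) == len(counts) + 1, "N bins need N+1 edges"
    return json.dumps({"name": name, "edges": edges, "counts": counts})

# A web client can then fetch only this payload instead of a full ROOT file.
payload = hist_to_json("ppsCalib", [0, 1, 2, 3], [5, 8, 2])
data = json.loads(payload)
```

Serving such small JSON payloads is what makes partial reading attractive: the browser never needs to download or parse the full ROOT file.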
CAR T-cell therapy has revolutionized cancer immunotherapy by reprogramming patient T cells to recognize and eliminate malignant cells, achieving remarkable success in hematologic cancers. However, its application to solid tumors remains challenging due to the complexity and heterogeneity of the tumor microenvironment. CARTopiaX is an advanced agent-based model developed on BioDynaMo, an...
The Deep Underground Neutrino Experiment (DUNE) will deploy four 10 kt fiducial-mass time projection chambers in order to study accelerator neutrinos, supernova neutrinos, beyond-the-standard-model physics, atmospheric neutrinos, and solar neutrinos. Reconstructing data in varying time domains over the nearly 400,000 channels needed to monitor this volume presents the complex challenge of...