As set out in the CERN Open Data Policy, all LHC experiments are committed to releasing research-quality open data. CMS has pioneered this effort and now celebrates a decade of regular data releases, with all LHC Run 1 data available in the public domain and ongoing releases of Run 2 data. This talk will provide an opportunity for the audience to reflect on what makes event-level open data...
For the development of modern event generators, comparison to data is an invaluable tool for tuning and validating the code. I will review efforts in which comparisons to data from past colliders have enabled improvements and extensions of event generators, with an emphasis on photoproduction and hard diffraction at HERA.
This contribution will describe ongoing efforts to provide event generation output to the broader community. There are a number of advantages that such a project could offer: reduced waste, easier project uptake, better validation, and improved communication between the experimental and phenomenological communities, among others.
Experiment analysis frameworks, physics data formats, and the expectations of LHC scientists have evolved to include interactive analysis with short turnaround times and the ability to build reproducible and re-interpretable workflows.
The CERN IT's Pilot Analysis Facility, the CERN Virtual Research Environment, and REANA have emerged as key solutions, as well as a platform...
One of the objectives of the EOSC (European Open Science Cloud) Future Project was to integrate diverse analysis workflows from Cosmology, Astrophysics and High Energy Physics in a common framework. This led to the inception of the Virtual Research Environment (VRE) at CERN, a prototype platform supporting the goals of Dark Matter and Extreme Universe Science Projects in compliance with FAIR...
The complexity of modern high-energy physics (HEP) experiments demands robust, flexible, and interoperable tools for statistical modeling. The HEP Statistics Serialization Standard (HS³) addresses this need by providing a unified framework for serializing statistical models and datasets in HEP research, allowing users to switch seamlessly between different implementations and modeling frameworks....
I will present the reinterpretation material of the CalRatio + X ATLAS analysis (arXiv:2407.09183). The analysis targets neutral long-lived particles decaying within the ATLAS hadronic calorimeter. The reinterpretation employs a Boosted Decision Tree (BDT) trained on truth-level variables to estimate the probability of events falling within the ABCD plane and to assess the sensitivity of the...