The ATLAS collaboration, using 139 fb$^{-1}$ of 13 TeV collisions from the Large Hadron Collider, has placed limits on the decay of a $Z$ boson to three dark photons. We reproduce the results of the ATLAS analysis, and then recast it as a limit on an exotic Higgs decay mode, in which the Higgs boson decays via a pair of intermediate (pseudo)scalars $a$ to four dark photons $V$ (or some other...
We consider a simple Dark Matter model containing both a scalar and a vector mediator. The model provides a framework for generating the masses of the mediators and the Dark Matter while maintaining a small number of free parameters. Using simplified model results from missing energy and resonance searches, we discuss how well this model is probed by current searches and the interplay between the...
This talk will give an overview of the CMS statistical model format (Combine datacards) and demonstrate the evaluation of such datacards with Combine. In addition, the review process automated in GitLab CI will be highlighted.
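To illustrate the format the abstract refers to, here is a minimal single-bin counting-experiment datacard in the publicly documented Combine syntax; the channel name "ch1", process names, and all yields are invented placeholders:

```text
# Minimal sketch of a Combine datacard (illustrative numbers only)
imax 1  number of channels
jmax 1  number of backgrounds
kmax 1  number of nuisance parameters
------------
bin         ch1
observation 10
------------
bin         ch1    ch1
process     sig    bkg
process     0      1
rate        1.5    9.0
------------
lumi  lnN  1.025  1.025
```

Processes with non-positive index are treated as signal; the `lnN` line assigns a 2.5% log-normal luminosity uncertainty to both processes.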
As agreed in the CERN Open Data policy, all LHC experiments are committed to releasing research-quality open data. CMS has pioneered this effort and now celebrates a decade of regular data releases, with all LHC Run 1 data available in the public domain and ongoing releases of Run 2 data. This talk will provide an opportunity for the audience to reflect on what makes event-level open data...
For the development of modern event generators, the comparison to data is an invaluable tool for the tuning and validation of the code. I will review efforts where the comparisons to data from past colliders have allowed for improvements and extensions of event generators, with an emphasis on photoproduction and hard diffraction at HERA.
This contribution will describe ongoing efforts to provide event generation output to the broader community. There are a number of advantages that such a project could offer: reduced waste, easier project uptake, better validation, and improved communication between the experimental and phenomenological communities, among others.
Experiment analysis frameworks, physics data formats, and the expectations of LHC scientists have evolved to include interactive analysis with short turnaround times and the possibility to optimize reproducible and re-interpretable workflows.
The CERN IT's Pilot Analysis Facility, the CERN Virtual Research Environment, and REANA have emerged as key solutions, as well as a platform...
One of the objectives of the EOSC (European Open Science Cloud) Future Project was to integrate diverse analysis workflows from Cosmology, Astrophysics and High Energy Physics in a common framework. This led to the inception of the Virtual Research Environment (VRE) at CERN, a prototype platform supporting the goals of Dark Matter and Extreme Universe Science Projects in compliance with FAIR...
The complexity of modern high-energy physics (HEP) experiments demands robust, flexible, and interoperable tools for statistical modeling. The HEP Statistics Serialization Standard (HS³) addresses this need by providing a unified framework for serializing statistical models and datasets in HEP research that allows users to switch seamlessly between different implementations and modeling frameworks....
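As a flavor of what such a serialized model looks like, the sketch below shows a single Gaussian model in HS³-style JSON. Field names follow my reading of the draft HS³ specification and may differ in detail from the final standard; all names and values are illustrative:

```json
{
  "metadata": {"hs3_version": "0.2"},
  "distributions": [
    {"name": "model", "type": "gaussian_dist",
     "mean": "mu", "sigma": "sigma", "x": "obs"}
  ],
  "domains": [
    {"name": "default_domain", "type": "product_domain",
     "axes": [{"name": "mu", "min": -5.0, "max": 5.0}]}
  ],
  "parameter_points": [
    {"name": "default_values",
     "parameters": [{"name": "mu", "value": 0.0},
                    {"name": "sigma", "value": 1.0}]}
  ]
}
```

Because the model is plain JSON rather than a framework-specific binary, any compliant implementation can deserialize and evaluate it.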
I will present the reinterpretation material of the CalRatio + X ATLAS analysis (arXiv:2407.09183). The analysis focuses on neutral long-lived particles decaying within the ATLAS hadronic calorimeter. The reinterpretation involves a Boosted Decision Tree (BDT) trained on truth-level variables to estimate the probability of events within the ABCD plane and assess the sensitivity of the...
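The reinterpretation strategy builds on the standard ABCD method: with two approximately independent discriminants, the plane is split into a signal region A and three control regions B, C, D, and the background in A is predicted as $N_A \approx N_B N_C / N_D$. A minimal sketch (not the analysis code; all yields invented):

```python
# Illustrative sketch of the ABCD background estimate underlying the
# reinterpretation; region yields below are invented toy numbers.

def abcd_estimate(n_b, n_c, n_d):
    """Predicted background yield in signal region A from control regions."""
    if n_d == 0:
        raise ValueError("region D must be populated")
    return n_b * n_c / n_d

# Toy control-region yields:
print(abcd_estimate(40, 30, 120))  # -> 10.0
```

The BDT trained on truth-level variables then supplies the per-event probabilities of landing in each region, replacing detector-level selections.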
This talk will cover the analysis preservation efforts in CMS, including BSM analysis implementations in REANA and, possibly, experience with HEPData.
Traditional approaches to precise Standard Model (SM) measurements of fundamental particles at the LHC generally restrict the format of these measurements to just one or two properties at a time in predetermined histogram bins. The ATLAS Experiment recently published such a measurement in a notable new format for LHC experiments: high-dimensional and unbinned datasets that can be used for a...
As machine learning becomes increasingly embedded in ATLAS analyses, ensuring the long-term usability of ML-based results poses several challenges. This talk will examine different aspects of ML preservation, including storing and documenting trained models, handling evolving software dependencies, and ensuring accessibility of input features tied to detector conditions. We will discuss the...
This talk summarizes recommendations and status of efforts towards preservation of ML models (and accompanying validation material) in CMS, in particular in view of reuse for BSM (re)interpretations outside the collaboration.
In recent years, LHC experiments have increasingly published the full likelihood models for statistical analysis in HEPData. For the purpose of reproducing and reinterpreting such analyses, it is essential to understand the contents of these statistical workspaces. As part of the validation process for an ATLAS combination of searches for Beyond-Standard-Model particles, I have developed an...
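The published full likelihoods are typically pyhf-style JSON workspaces, which can be inspected with standard tools. The sketch below summarizes the channels, samples, and modifiers of a tiny invented workspace; a real one would be downloaded from HEPData:

```python
# Hedged sketch: summarizing a pyhf-style JSON workspace with the stdlib only.
# The inline workspace is invented for illustration.
import json

workspace_json = """
{
  "channels": [
    {"name": "SR",
     "samples": [
       {"name": "signal", "data": [2.0],
        "modifiers": [{"name": "mu", "type": "normfactor", "data": null}]},
       {"name": "background", "data": [50.0],
        "modifiers": [{"name": "bkg_norm", "type": "normsys",
                       "data": {"hi": 1.1, "lo": 0.9}}]}
     ]}
  ]
}
"""

def summarize(ws):
    """Return {channel: [sample names]} and the set of modifier names."""
    channels = {c["name"]: [s["name"] for s in c["samples"]]
                for c in ws["channels"]}
    modifiers = {m["name"] for c in ws["channels"]
                 for s in c["samples"] for m in s["modifiers"]}
    return channels, modifiers

channels, modifiers = summarize(json.loads(workspace_json))
print(channels)           # -> {'SR': ['signal', 'background']}
print(sorted(modifiers))  # -> ['bkg_norm', 'mu']
```

A walk over this structure is often the first validation step before any fits are attempted.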
ATLAS and CMS have conducted two pairs of broadly equivalent analyses and find excesses in all four cases: searches for soft leptons (originally in supersymmetric models with compressed spectra) and for monojets. I will describe recent work on trying to interpret these excesses, and to quantify their significance, in terms of both supersymmetric and non-supersymmetric models, and to...
The landscape of frameworks for (re)implementing HEP analyses for reinterpretation is, fortunately, diverse. One advantage of this diversity is that multiple reimplementations of the same HEP analysis in different frameworks can be cross-validated. On the downside, statistical combinations of analyses across different frameworks must carefully avoid double-counting events, which could...
Following on from discussions at previous reinterpretation forums on the reinterpretation of ML-based searches, we look at how one of these (ATLAS-SUSY-2018-30) has been used as a test bed for studies of ML-based reweighting, which can reduce the computational load of studying large parameter spaces. Some new features of YODA 2 that allowed the project to run in Rivet will also be highlighted.
An update on Contur (Constraints On New Theories Using Rivet) developments.
We present recent developments in the Analysis Description Language (ADL) and the runtime interpreter CutLang in the context of (re)interpretation studies. The talk will cover ongoing validation efforts from LHC BSM analyses and various improvements to the infrastructure to accommodate analysis implementation and validation requirements. We will also present studies using ATLAS open data....
ColliderBit is the GAMBIT module responsible for simulating LHC physics in the GAMBIT global fitting tool. In this talk we will present recent technical developments in ColliderBit, including newly implemented LHC analyses, improvements to three-body decay kinematics relevant for supersymmetry searches, as well as updates to interfaces to other codes.
I will discuss recent developments and new features in CheckMATE, in particular multibin signal regions, new searches using machine learning methods and (perhaps) statistical combination of searches.
SModelS is a public tool for fast reinterpretation of LHC searches for new physics based on a large database of simplified model results. Version 3 includes a new framework based on a description of arbitrary simplified model topologies as directed graphs. This new development allows the tool to go beyond Z2-preserving topologies, including results from resonance searches, R-Parity violating...
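As a toy illustration of the directed-graph representation the abstract mentions (this is not SModelS code; node labels are invented), a simplified-model topology can be encoded as an adjacency map from each decaying particle to its decay products:

```python
# Hedged illustration: a simplified-model topology as a directed graph.
# Nodes are particles; edges point from a decaying particle to its products.
from collections import defaultdict

topology = defaultdict(list)
# PV -> gluino pair; each gluino -> quark quark neutralino (invented labels)
topology["PV"] = ["gluino_1", "gluino_2"]
topology["gluino_1"] = ["q", "q", "N1"]
topology["gluino_2"] = ["q", "q", "N1"]

def final_states(graph, node="PV"):
    """Collect the stable (leaf) particles of the decay graph."""
    if not graph.get(node):
        return [node]
    return [leaf for child in graph[node] for leaf in final_states(graph, child)]

print(sorted(final_states(topology)))  # -> ['N1', 'N1', 'q', 'q', 'q', 'q']
```

A graph like this, unlike a fixed two-branch description, can accommodate resonance and R-parity-violating topologies with arbitrary branching structure.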
The Rivet framework is widely used for analysis preservation and Monte Carlo validation, featuring around 2000 analysis routines, and is primarily used in BSM reinterpretation through the measurement-focused Contur method and tool. However, it is much less established in preservation of BSM searches, despite possessing the essential features such as detector "smearing" of physics objects, and...
While advancements in software development practices across particle physics and adoption of Linux container technology have made substantial impact in the ease of replicability and reuse of analysis software stacks, the underlying software environments are still primarily bespoke builds that lack a full manifest to ensure reproducibility across time. The [HEP Packaging...
Statistically significant di-photon excesses at 95 GeV ($>3\sigma$) and 152 GeV ($>4\sigma$) have been observed. Furthermore, strong tensions exist between the SM predictions and measurements of W and top-like signatures at the LHC. I discuss the status of these anomalies and detail the reinterpretation studies performed on the relevant ATLAS searches to assess how the excesses can be related to new...
We probe the trilinear R-parity violating (RPV) supersymmetric (SUSY) scenarios with specific non-zero interactions in the light of neutrino oscillation, Higgs, and flavor observables. We attempt to fit the set of observables using a state-of-the-art Markov Chain Monte Carlo (MCMC) set-up and study its impact on the model parameter space. Our main objective is to constrain the trilinear...