We introduce the Z'-explorer software, a new tool to probe Z' models using the available visible decay channels at the LHC. The tool scrutinizes the parameter space of a model to determine which part is still allowed, which part will soon be probed, and which channel is the most sensitive in each region of parameter space. The user does not need to implement the model nor run any Monte Carlo...
In this contribution, we report on the latest developments in MadAnalysis 5 relevant for recasting studies. The software is now equipped with its own fast detector simulator (the MA5-SFS framework) based on efficiency and smearing functions, thus offering users an option different from, and more lightweight than, Delphes 3 for dealing with detector effects. Implementations of 4 Run 2...
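To illustrate the idea behind an efficiency-and-smearing detector parameterization, here is a toy sketch in pure Python. The resolution, turn-on curve and threshold values are invented placeholders for illustration only, not MA5-SFS defaults:

```python
import math
import random

random.seed(0)  # deterministic toy run

def smear_pt(pt_true):
    """Toy Gaussian smearing of a jet pT; the 10% resolution is an
    invented placeholder, not an MA5-SFS default."""
    return random.gauss(pt_true, 0.10 * pt_true)

def reco_efficiency(pt):
    """Toy sigmoid turn-on efficiency curve, purely illustrative."""
    return 1.0 / (1.0 + math.exp(-(pt - 25.0) / 5.0))

def reconstruct(truth_jets):
    """Keep each truth-level jet with probability reco_efficiency(pt),
    then smear the pT of the surviving jets."""
    reco = []
    for pt in truth_jets:
        if random.random() < reco_efficiency(pt):
            reco.append(smear_pt(pt))
    return reco

jets = reconstruct([15.0, 40.0, 120.0])
```

The appeal of this approach for recasting is that the detector response is reduced to a handful of transparent, tunable functions rather than a full simulation chain.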
SModelS is an automated tool for the fast reinterpretation of LHC searches for theories beyond the Standard Model (BSM) using a large database of simplified-model results. Until recently, SModelS could only handle simplified models describing prompt decays, limiting its applicability to prompt MET searches and searches for stable charged particles. Version 2.0 generalizes the SModelS...
A huge amount of effort and person-power goes into searching for evidence of beyond-the-SM (BSM) theories at the LHC. A search may take a large team over a year to produce, and even then may only focus on the model's most spectacular signature. But many BSM theories could probably already be ruled out because they would have caused measurable distortions to well-understood spectra of...
The proposition that it is in our scientific interest to embrace a more open posture with respect to data generated by a publicly funded science such as particle physics is gaining traction. In this talk, after a brief overview of likelihood functions, I explain why it is in our scientific interest to make the publication of full likelihoods routine and straightforward. I briefly describe the...
Full likelihoods encode the entire statistical model of an analysis and are thus among the most valuable analysis data products, for use cases ranging from SM measurements to BSM searches. ATLAS has recently started to release the first full analysis likelihoods using a Python-based implementation of HistFactory. In this talk, the JSON specification used to release the...
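As an illustration of what such a serialized likelihood looks like, here is a minimal single-channel workspace in the HistFactory JSON format read by pyhf. All names and yields are invented for illustration and are not taken from any released ATLAS likelihood:

```json
{
  "channels": [
    { "name": "singlechannel",
      "samples": [
        { "name": "signal",
          "data": [12.0, 11.0],
          "modifiers": [ { "name": "mu", "type": "normfactor", "data": null } ] },
        { "name": "background",
          "data": [50.0, 52.0],
          "modifiers": [ { "name": "uncorr_bkguncrt", "type": "shapesys", "data": [3.0, 7.0] } ] }
      ] }
  ],
  "observations": [ { "name": "singlechannel", "data": [51.0, 48.0] } ],
  "measurements": [ { "name": "Measurement", "config": { "poi": "mu", "parameters": [] } } ],
  "version": "1.0.0"
}
```

Because the full statistical model (samples, modifiers and observations) is declared in plain JSON, the likelihood can be re-evaluated, refit or reinterpreted without access to any experiment-internal software.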
The CMS collaboration aims to provide high-quality analysis results to the particle physics community. The rich search program for beyond-the-standard-model phenomena in various topologies is an excellent basis for reinterpretation by external scientists. This talk will discuss the steps taken by the collaboration in order to facilitate such reinterpretations, in particular focusing on the...
In order to maximise the sensitivity to parameters of interest (e.g. Wilson coefficients), theoretical uncertainties can be constrained using the data itself. This, however, builds theoretical assumptions into the measurements. Reinterpreting such measurements is crucial to prevent them from becoming obsolete with theoretical progress. In this talk, I will discuss the challenges and...
Models where dark matter is part of an electroweak multiplet feature charged particles with macroscopic lifetimes, due to a charged-neutral mass splitting of the order of the pion mass. We have reinterpreted the latest ATLAS disappearing-track search for models with DM of different spins: the inert Two Higgs Doublet, Minimal Fermion Dark Matter and Vector Triplet Dark Matter models. We have found...
The CMS disappearing track search using the full Run 2 data (2004.05153) has the potential to place the strongest limits on many scenarios, such as minimal dark matter and especially supersymmetric models where the electroweakino mass splittings become small enough that the charginos are long-lived. I shall report on progress in producing code to recast this analysis.
I will discuss the current status and open issues for recasting the 13 TeV 36 fb-1 ATLAS disappearing track search which is documented at arXiv:1712.02118.
The talk will focus on recent LLP searches at the LHC, with an emphasis on those that are not already well known to the reinterpretation community (i.e., not focusing on the disappearing-track or displaced Inner Detector vertex analyses). Besides discussing the analyses and signatures covered, the talk will cover which auxiliary material, if any, is available for reinterpretation, ...
Comparing models with modified Higgs sectors to the available measurements and limits is a complicated task due to the large number of relevant experimental results. Codes such as HiggsBounds and HiggsSignals compare any model prediction to the experimental results from scalar searches and measurements. Interpreting the model-independent experimental results in specific new physics scenarios...
Precision measurements in Higgs, top and other SM processes are extremely valuable for constraining physics scenarios beyond the Standard Model. Interpretations of these measurements in terms of effective field theories (EFTs) are increasingly popular. In addition to the dedicated EFT measurements produced by ATLAS and CMS, precision measurements can be reinterpreted under EFTs or fed into global fits...
I will give an overview of the theoretical aspects that are relevant for the interpretation of LHC data in an Effective Field Theory framework.
In particular, I will summarize the points concerning EFT formalism and prediction tools that are currently under discussion within the LHC EFT Working Group.
The indirect effects of heavy new physics can be characterised by higher-dimensional operators in the Standard Model Effective Field Theory (SMEFT). In this talk I will present a recent global fit of dimension-6 operators to top, Higgs, diboson and electroweak precision data using the Fitmaker tool created for this purpose.
Direct detection experiments search for dark matter from the galactic halo scattering in detectors on Earth.
The expected rate and spectral signature of such a signal depend on assumptions about the galactic halo, the dark matter velocity distribution, the dark matter model and more.
Published experimental constraints are presented for specific dark matter interaction types and dark matter masses, with most...
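The assumptions listed above all enter the standard spin-independent differential event rate, which (schematically, per unit target mass) reads

```latex
\frac{dR}{dE_R}
  = \frac{\rho_0\,\sigma_0\,F^2(E_R)}{2\,m_\chi\,\mu_{\chi N}^2}
    \int_{v > v_{\min}(E_R)} \frac{f(\vec v)}{v}\, \mathrm{d}^3 v ,
\qquad
v_{\min}(E_R) = \sqrt{\frac{m_N E_R}{2\,\mu_{\chi N}^2}} ,
```

where $\rho_0$ is the local dark matter density, $m_\chi$ the dark matter mass, $\mu_{\chi N}$ the dark-matter–nucleus reduced mass, $\sigma_0$ the reference cross section, $F(E_R)$ the nuclear form factor, and $f(\vec v)$ the halo velocity distribution. Changing any of these inputs changes the rate, which is why published limits for one set of assumptions cannot simply be reused for another.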
I'll discuss my experiences in interpreting direct detection data for non-standard signals: the problems that I have run into in the past, the advances that have been made over the past five years, and ways to make the process of interpreting data easier.
I shall discuss selected topics of broad interest that arise in the interpretation and combination of neutrino oscillation data from current and prospective experiments.
Many BSM searches in ATLAS provide auxiliary information, uploaded to HEPData, that details a C++ analysis code snippet with the region definitions of the analysis. Until now, the software infrastructure for running this code snippet has not been provided, meaning that theorists wishing to use the HEPData information often needed to reimplement the analysis selections in their own framework. This...
Championing CERN's efforts in open science, the CMS Collaboration has released a large portion of the LHC's Run 1 data to date. In this presentation, we report on the first CMS Open Data Workshop held remotely last year, the tools used therein and the feedback obtained. In addition, the current status and plans are reviewed in adherence to the new, revitalized CERN open data policy.
The starting point for a statistically sound reinterpretation of an experimental result is the likelihood function, which encodes the probability of the observed data given an assumed model. When multiple experimental results are considered, as in global fits, a composite likelihood function is used, constructed from likelihood components for all the measurements included in the fit.
In...
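The construction of a composite likelihood from independent components can be sketched in a few lines. The following toy example combines two Poisson counting measurements with invented yields (none of these numbers correspond to a real analysis) and scans a common signal strength:

```python
import math

def poisson_logl(n_obs, mu_expected):
    """Log-likelihood of observing n_obs events given expectation mu_expected."""
    return n_obs * math.log(mu_expected) - mu_expected - math.lgamma(n_obs + 1)

def composite_logl(signal_strength, measurements):
    """Composite log-likelihood: the sum of independent per-measurement
    components, i.e. the log of the product of their likelihoods.

    Each measurement is (n_obs, signal_yield, background_yield); the values
    below are invented for illustration, not taken from any real analysis."""
    return sum(
        poisson_logl(n, signal_strength * s + b)
        for (n, s, b) in measurements
    )

# Toy "global fit" over two independent counting experiments:
measurements = [(12, 5.0, 8.0), (30, 10.0, 25.0)]

# Scan the signal strength mu on a grid and pick the maximum-likelihood value.
best_mu = max((mu / 100 for mu in range(0, 301)),
              key=lambda mu: composite_logl(mu, measurements))
```

Real global fits replace the Poisson components with the published likelihood of each measurement, but the structure, a sum of independent log-likelihood terms, is the same.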
The DNNLikelihood framework is presented and its main features are discussed. This framework encodes experimental likelihood functions in deep neural networks and allows for a lightweight and platform-independent distribution of physics results through the ONNX model format. The procedure retains the full experimental information and relies neither on a Gaussian approximation nor on...
We present a novel algorithm to identify potential dispersed signals of new physics in the slew of published LHC results. It employs a random walk algorithm to introduce sets of new particles, dubbed "protomodels", which are tested against simplified-model results from ATLAS and CMS (exploiting the SModelS software framework).
A combinatorial...
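The random-walk idea can be illustrated schematically. The sketch below is not the actual protomodel/SModelS implementation; the candidate particles and the scoring function are invented stand-ins for the real test statistic against ATLAS and CMS simplified-model results:

```python
import random

random.seed(1)  # deterministic toy run

CANDIDATE_PARTICLES = ["X1", "X2", "X3", "X4"]  # hypothetical BSM states

def score(protomodel):
    """Stand-in for the test statistic comparing a protomodel against
    simplified-model results; invented for illustration only."""
    target = {"X1", "X3"}  # pretend the data prefer this combination
    return len(protomodel & target) - 0.5 * len(protomodel - target)

def random_walk(steps=200):
    """Randomly toggle particles in and out of the protomodel, keeping
    improvements and occasionally accepting regressions to escape
    local optima."""
    current, best = set(), set()
    for _ in range(steps):
        proposal = set(current)
        proposal.symmetric_difference_update({random.choice(CANDIDATE_PARTICLES)})
        if score(proposal) >= score(current) or random.random() < 0.1:
            current = proposal
        if score(current) > score(best):
            best = set(current)
    return best

best_protomodel = random_walk()
```

In the real algorithm the walk also proposes changes to particle masses and signal strengths, and the score is built from the statistical compatibility of the protomodel with the full database of published results.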
A new paradigm for data-driven, model-agnostic new physics searches at colliders is emerging, and aims to leverage recent breakthroughs in anomaly detection and machine learning. In order to develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied...
The type-III seesaw readily and naturally explains the smallness of neutrino masses. High-energy seesaw theories usually involve a larger number of effective parameters than the physical, measurable parameters appearing in low-energy neutrino phenomenology. The Casas-Ibarra parameterization makes it possible to encode the information lost in integrating the heavy fermions out...
While overwhelming cosmological evidence points to the existence of Dark Matter (DM), only its gravitational interaction has been experimentally confirmed. Limitations of the most general mono-X DM signature at colliders motivate searches beyond it. One possibility is weak-multiplet (e.g. doublet) DM, whose weak interactions give multilepton plus missing-energy final states that...
We present a calculation of the next-to-leading order QCD corrections for the scattering of Dark Matter particles off nucleons in the framework of simplified models with $s$- and $t$-channel mediators. These results are matched to the Wilson coefficients and operators of an effective field theory that is generally used for the presentation of experimental results on spin-independent and...
Supersymmetric models with Dirac gauginos, instead of Majorana ones, are known to have distinctive phenomenological properties. Concretely, in the Minimal Dirac Gaugino Supersymmetric Standard Model (MDGSSM), the electroweakino sector is characterized by an enriched spectrum with a total of 6 neutralinos and 4 charginos exhibiting naturally small mass splittings, thus yielding a frequent...
The Global and Modular Beyond-the-standard-model Inference Tool (GAMBIT) is an open-source framework for performing BSM global fits. In this talk I will give an update on recent and ongoing developments in GAMBIT. In particular, I will introduce the GAMBIT Universal Model Machine (GUM), to be included in the upcoming GAMBIT 2.0. GUM is a new tool for automatically generating model-specific...