In this talk, we will give a brief overview of CheckMATE 2 and discuss recent developments. In particular, we review our current efforts to implement long-lived particle searches, the addition of a new linear-collider module, and our attempt to include subjet-structure searches. Finally, we discuss recent problems in implementing exotic as well as supersymmetric analyses from...
We will discuss the new functionalities of MadAnalysis v1.6, as well as recent extensions of the MadAnalysis5 Public Analysis Database for the recasting of LHC results. In particular, the new functionalities include direct handling of signal-region definitions in the normal mode of MadAnalysis5, options for dealing with events featuring both fat and normal jets, and support for...
Since the release of GAMBIT 1.0 last year, many improvements have been implemented and more are in the pipeline. GAMBIT 1.1 brought the ability to add backends in Mathematica, and backends in Python will follow very soon. Most importantly, significant enhancements have been made to ColliderBit, the collider module of GAMBIT. These range from the addition of new searches,...
SModelS is a tool that allows for a systematic application of the LHC's simplified-model results to an arbitrary BSM model. In this talk I shall discuss recent and future developments of the code, as well as of the database of SMS results, with a focus on the new statistical treatment of correlated signal regions.
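The comparison at the heart of such simplified-model recasting can be illustrated with a toy example: the predicted cross section times branching ratio for a topology is divided by the published 95% CL upper limit, and the model point is considered excluded when that ratio r exceeds one. All numbers and topology names below are made up, and this is a schematic sketch, not the SModelS API.

```python
# Toy illustration of the r-value test used in simplified-model recasting.
# All numbers are hypothetical; this is not the SModelS API.

def r_value(predicted_pb, upper_limit_pb):
    """Ratio of the predicted signal to the experimental 95% CL upper limit."""
    return predicted_pb / upper_limit_pb

# Hypothetical (prediction, published upper limit) pairs in pb
topologies = {
    "T1": (0.8, 2.0),
    "T2": (1.5, 0.9),
}

for name, (pred, ul) in topologies.items():
    r = r_value(pred, ul)
    print(f"{name}: r = {r:.2f} -> {'excluded' if r > 1.0 else 'allowed'}")
```

In this toy scan, only the second topology yields r > 1 and hence drives the exclusion of the model point.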
I will give a short summary of how non-minimal supersymmetric models can nowadays be studied with a precision comparable to the MSSM or NMSSM. The setup is based on the Mathematica package SARAH, which fully automatically generates a modified SPheno version for a given model; this calculates, for instance, the Higgs masses at the two-loop level and makes predictions for the most important flavour...
BSM-AI and SUSY-AI use a range of machine-learning tools to learn the exclusion contours and model likelihoods of arbitrary high-dimensional (beyond two parameters) models for new physics. We discuss various examples and use cases. Furthermore, we present a database for storing model evaluations for the HEP phenomenology community.
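The basic idea can be sketched with a deliberately simple stand-in: a classifier learns the excluded/allowed label from a finite scan and then predicts it anywhere in parameter space. Here the "expensive" truth is a made-up circular exclusion region and the classifier is a plain 1-nearest-neighbour lookup; the actual tools use far more capable machine-learning methods on realistic high-dimensional scans.

```python
import random

# Toy sketch of learning an exclusion boundary from scan points. The
# truth function below is a hypothetical stand-in for an expensive
# collider-limit evaluation; 1-NN replaces the real ML classifiers.

def truth(x, y):
    return x * x + y * y < 1.0   # True = excluded (made-up limit)

random.seed(0)
train = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(2000)]
labels = [truth(x, y) for x, y in train]

def predict(x, y):
    """Return the label of the nearest training point."""
    i = min(range(len(train)),
            key=lambda j: (train[j][0] - x) ** 2 + (train[j][1] - y) ** 2)
    return labels[i]

print(predict(0.1, 0.1))   # point deep inside the excluded region
print(predict(1.8, 1.8))   # point far outside
```

Once trained, such a surrogate answers "excluded or not?" at negligible cost compared to rerunning the full simulation chain for every new parameter point.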
A common goal in reinterpretations is the determination of iso-hyper-surfaces on which e.g. a likelihood or a test statistic such as CLs is constant (for example, in two dimensions the CLs = 0.05 iso-surface is the exclusion contour). Traditionally, Cartesian grids are used in two dimensions, but the cost of generating Monte Carlo events for each point makes this approach prohibitive in higher...
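One way to see why a full grid is wasteful: the CLs = 0.05 crossing can be located directly, e.g. by bisection along rays from an excluded reference point, at a logarithmic number of evaluations per contour point. The sketch below uses a smooth, entirely hypothetical toy_cls surface in place of an expensive Monte Carlo evaluation.

```python
import math

# Sketch: locate the CLs = 0.05 contour by bisection along rays from the
# origin instead of filling a Cartesian grid. toy_cls is a made-up,
# smooth stand-in for an expensive Monte Carlo test-statistic evaluation.

def toy_cls(x, y):
    """Toy CLs surface: ~0 (excluded) near the origin, ~1 far away."""
    return 1.0 - math.exp(-0.5 * (x * x + y * y))

def contour_point(angle, target=0.05, lo=0.0, hi=5.0, tol=1e-6):
    """Bisect along a ray from the origin for the toy_cls == target crossing."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        x, y = mid * math.cos(angle), mid * math.sin(angle)
        if toy_cls(x, y) < target:
            lo = mid   # still excluded; move outward
        else:
            hi = mid
    return lo

# Eight contour points, one per ray; each costs ~log2(range/tol) evaluations
contour = [contour_point(2.0 * math.pi * k / 8) for k in range(8)]
print([round(r, 3) for r in contour])
```

Each contour point costs roughly 23 evaluations at this tolerance, independent of dimension along the ray, whereas grid cost grows exponentially with the number of parameters.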
LHC experiments, especially ATLAS, use the popular HistFactory package bundled with ROOT to define statistical models based on histogram templates and to perform statistical tests on those models, such as interval estimation and limit setting. In order to facilitate the use of such experiment-grade likelihood models outside of the collaboration, we present a standalone implementation of...
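The flavour of such template-based limit setting can be conveyed with a toy single-bin counting model: a Poisson likelihood ratio as test statistic and pseudo-experiments to estimate CLs. This is a deliberately stripped-down sketch; a real HistFactory-style model also profiles constrained nuisance parameters and handles many bins and channels.

```python
import math
import random

# Toy single-bin counting model in the spirit of a histogram template:
# expected yield s + b under signal-plus-background, b under background
# only. CLs is estimated from pseudo-experiments; nuisance parameters
# are deliberately omitted here.

def q(n, s, b):
    """-2 ln of the Poisson likelihood ratio L(s+b) / L(b) for n observed."""
    return 2.0 * (s - n * math.log(1.0 + s / b))

def cls(n_obs, s, b, n_toys=20000, seed=1):
    rng = random.Random(seed)

    def poisson(mu):
        # Knuth's product-of-uniforms sampler (fine for small mu)
        limit, k, p = math.exp(-mu), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    q_obs = q(n_obs, s, b)
    # One-sided p-values: fraction of toys at least as background-like
    p_sb = sum(q(poisson(s + b), s, b) >= q_obs for _ in range(n_toys)) / n_toys
    clb = sum(q(poisson(b), s, b) >= q_obs for _ in range(n_toys)) / n_toys
    return p_sb / clb if clb > 0 else 1.0

# 5 events observed with b = 5 expected: a signal of s = 10 is disfavoured
print(round(cls(n_obs=5, s=10.0, b=5.0), 4))
```

Normalising the signal-plus-background p-value by CLb is what makes CLs conservative in regions of little experimental sensitivity, which is why it is the conventional choice for limit setting.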
PhenoData (part of the HEPMDB project) is intended to centrally and effectively store data from HEP papers that do not provide public data elsewhere, in order to avoid duplication of effort among HEP researchers digitizing plots. It also allows storing information about recasting codes and analyses related to the respective HEP paper. The database has an easy search interface and paper...
We will give an update on the experiment-internal analysis preservation and reusability efforts. We present the latest results of experiment-internal reinterpretations, in part derived through the RECAST framework and an integration of both the CheckMate2 and MadAnalysis analysis catalogues. Further, we will discuss progress on generic analysis preservation with examples from CMS...
"CutLang" software package contains a domain specific language that aims to provide a clear, human readable way to define HEP analyses, and an interpretation framework of that language. A proof of principle (PoP) implementation of the CutLang interpreter, achieved using C++ as a layer over the CERN data analysis framework ROOT, is presently available. This PoP implementation permits writing...
LHC result reinterpretation typically requires more information than is provided in the result publication. This includes detector-response effects, the acceptances of the different steps of the event selection (needed to validate the analysis emulation), and possibly additional details on the event selection and analysis procedure. A standard that unambiguously describes an LHC analysis and provides all needed...
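The kind of information such a standard would need to capture can be sketched schematically. The field names and values below are invented purely for illustration and do not follow any existing or proposed schema.

```python
# Purely hypothetical sketch of the information an analysis-description
# standard would record; all field names and numbers are invented for
# illustration and follow no existing schema.

analysis_description = {
    "analysis_id": "EXOT-EXAMPLE-01",          # hypothetical identifier
    "detector_response": {
        "jet_pt_resolution": "parametrised",
        "lepton_efficiency_maps": True,
    },
    "object_definitions": {
        "jets": {"algorithm": "anti-kt", "R": 0.4, "pt_min_gev": 20.0},
    },
    "event_selection": [
        # per-step acceptances, as needed to validate an emulation
        {"cut": "n_jets >= 4", "acceptance": 0.62},
        {"cut": "met > 200 GeV", "acceptance": 0.18},
    ],
    "signal_regions": ["SR1", "SR2"],
}

print(len(analysis_description["event_selection"]))
```

Recording per-step acceptances alongside the object and cut definitions is precisely what lets a recaster check their emulation stage by stage rather than only against the final yield.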