The LHC collaborations are pursuing searches for new physics in a vast variety of channels. While the collaborations typically provide their own interpretations of their results, for instance in terms of simplified models, fully understanding the implications of these searches requires interpreting the experimental results in the context of all kinds of theoretical models. This is a very active field, with close theory-experiment interaction and with several public tools being developed.
A Forum on the interpretation of the LHC results for BSM studies was thus initiated to discuss topics related to the BSM (re)interpretation of LHC data, including the development of the necessary public recasting tools and related infrastructure, and to provide a platform for continued interaction between theorists and the experiments.
This is the fourth workshop of this Forum. Previous meetings took place
For information about housing and laptop registration, please check http://lpcc.web.cern.ch/about
A Vidyo connection will be available.
Access to CERN: if you don't hold a valid CERN access card, make sure you request one on the registration page.
Over the past years, ATLAS has developed a well defined approach to ease the reinterpretation of the results of BSM analyses. I will review the key points of this strategy, with emphasis on the auxiliary material that the collaboration is making available to the particle physics community (excluding long-lived particles which will be discussed in dedicated talks).
Study of the sensitivity of "prompt" searches to long-lived particles, discussing experimental aspects and the information required for the correct treatment in public recast tools.
Simplified Likelihoods are a convenient tool for re-interpretations of searches for new physics, requiring only a relatively limited set of information to be provided by the experimental collaborations. More precise re-interpretations are, however, possible with further information. A proposal for an extension to the Simplified Likelihood is presented which aims to improve its accuracy while retaining the minimal amount of information required and the simplicity of the method.
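For orientation, the Simplified Likelihood in its standard form (written here in common notation, not taken from the talk) combines Poisson terms for the signal-region counts with a multivariate Gaussian constraint on the backgrounds, whose covariance matrix \Sigma is the main experimental input:

    L(\mu, \theta) = \prod_{i=1}^{N} \mathrm{Pois}\!\left( n_i \,\middle|\, \mu s_i + b_i + \theta_i \right) \times \mathcal{N}\!\left( \theta \,\middle|\, \mathbf{0}, \Sigma \right)

where n_i, s_i and b_i are the observed, signal and background yields in bin i; the proposed extension supplies a modest amount of additional information beyond \Sigma to better capture non-Gaussian background uncertainties.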
ATLAS and CMS particle-level measurements are presented: differential cross sections and Simplified Template Cross Sections (STXS). The framework and material provided for the reinterpretation of results are described, including a simple example.
I will discuss recent detector-corrected ATLAS measurements with a focus on electroweak and top interactions, their availability within HEPData and Rivet as well as some of the limitations when using the accompanying material for reinterpretation.
In this talk I will discuss a set of Standard Model and top-quark analyses in CMS which have been re-interpreted for a different measurement. I will also show examples of analyses where the re-interpretation did not happen, and the reasons why.
Latest results using Contur (Rivet) will be presented, along with an update of the project.
I will discuss the status and prospects of the current flavor anomalies at LHCb.
I will present an update of the program SuperIso, and discuss the interpretations of the recent anomalies observed at LHCb.
I will provide a summary of the decay capabilities of the most recent version of SoftSusy, incorporating key MSSM, NMSSM and Higgs decays and many other modes. I will also provide an indication of future directions for further developments of the program.
In this talk, we will give a brief overview of CheckMATE 2 and discuss recent developments. In particular, we review our current efforts to implement long-lived particle searches, the addition of a new linear-collider module and our attempt to include searches based on jet substructure. Finally, we discuss recent problems in implementing exotic as well as supersymmetric analyses from the ATLAS and CMS collaborations, and give explicit examples where we have encountered difficulties.
We will discuss the new functionalities of MadAnalysis 5 v1.6, as well as recent extensions of the MadAnalysis 5 Public Analysis Database (PAD) for the recasting of LHC results. The new functionalities include direct handling of signal-region definitions in the normal mode of MadAnalysis 5, options for dealing with events featuring both fat and normal jets, and support for the LHE3 and HepMC3 formats. Moreover, MadAnalysis 5 can now be executed directly from within MadGraph5_aMC@NLO thanks to a new interface between the two codes. On the recasting side, we will present and discuss the implementation of new 13 TeV ATLAS and CMS analyses within the Public Analysis Database. These analyses focus on supersymmetry, dark matter and more exotic searches. On the technical side, the recasting procedure has been fully automated, so that CLs numbers can be obtained straightforwardly from an event sample, and the framework has been extended to handle long-lived particles.
After the release of GAMBIT 1.0 last year, many improvements have been implemented and more are in the pipeline. With GAMBIT 1.1 came the ability to add backends in Mathematica, and backends in Python will follow very soon. Most importantly, a significant number of enhancements have been made to ColliderBit, the collider module of GAMBIT. These range from the addition of new searches, namely 13 TeV analyses from the LHC, to the introduction of new features such as simplified likelihoods, dynamic event-generation convergence, etc.
SModelS is a tool that allows for a systematic application of the LHC's simplified-model results to an arbitrary BSM model. In this talk I shall discuss recent and future developments of the code as well as of the database of simplified-model (SMS) results, with a focus on the new statistical treatment of correlated signal regions.
I will give a short summary of how non-minimal supersymmetric models can nowadays be studied with a precision comparable to the MSSM or NMSSM. The setup is based on the Mathematica package SARAH, which fully automatically generates a modified SPheno version for a given model; this calculates, for instance, the Higgs masses at the two-loop level and makes predictions for the most important flavour and precision observables. An interface to HiggsBounds/HiggsSignals also exists. The obtained spectrum files can be used together with other outputs of SARAH to perform Monte Carlo or dark matter studies using well-established tools.
BSM-AI and SUSY-AI use a range of machine-learning tools to learn the exclusion contours and model likelihoods of arbitrary high-dimensional (beyond two parameters) models for new physics. We discuss various examples and use cases. Furthermore, we present a database to store model evaluations for the HEP phenomenology community.
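To illustrate the general idea (a generic sketch, not the actual BSM-AI/SUSY-AI code; the toy data and exclusion rule below are invented), a classifier trained on already-evaluated model points can predict the exclusion status of new points at negligible cost:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Toy stand-in for a set of evaluated model points:
    # X holds model parameters (here 5 of them), y the exclusion flag
    # (1 = excluded at 95% CL, 0 = allowed) obtained from a full recast.
    rng = np.random.default_rng(42)
    X = rng.uniform(-2.0, 2.0, size=(5000, 5))
    y = (np.linalg.norm(X, axis=1) < 2.0).astype(int)  # fake "exclusion" rule

    # Train the surrogate classifier on the evaluated points.
    clf = RandomForestClassifier(n_estimators=200)
    clf.fit(X[:4000], y[:4000])

    # New parameter points can now be classified in milliseconds
    # instead of rerunning event generation and detector simulation.
    print("held-out accuracy:", clf.score(X[4000:], y[4000:]))
    p_excluded = clf.predict_proba(rng.uniform(-2, 2, size=(3, 5)))[:, 1]
    print("P(excluded) for three new points:", p_excluded)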
A common goal in reinterpretation studies is the determination of iso-hypersurfaces of constant value of, e.g., a likelihood or a test statistic such as CLs (for example, in two dimensions the iso-surface is the contour CLs = 0.05). Traditionally, Cartesian grids are used in two dimensions, but the cost of generating Monte Carlo events for each point makes this approach prohibitive in higher dimensions.
We present an iterative algorithm based on Bayesian optimization and Gaussian processes that samples the parameter space by continuously incorporating information from evaluations at prior points to determine the optimal next point to evaluate (e.g. generate Monte Carlo events for), and present results showing significant savings in computational resources.
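A minimal sketch of such a loop, assuming a one-dimensional toy problem where an invented function stands in for the expensive CLs evaluation (a generic illustration, not the authors' implementation):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_cls(x):
        """Toy stand-in for a costly CLs evaluation (MC generation + limit setting)."""
        return np.exp(-0.5 * (x - 1.2) ** 2)

    TARGET = 0.05                          # we want the iso-contour CLs == 0.05
    X = np.array([[0.0], [2.0], [4.0]])    # initial evaluations
    y = np.array([expensive_cls(x[0]) for x in X])
    grid = np.linspace(0.0, 4.0, 400).reshape(-1, 1)

    for _ in range(15):
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)
        mu, sigma = gp.predict(grid, return_std=True)
        # Acquisition: prefer points whose prediction is close to the target
        # contour but still uncertain -- these pin down the iso-surface fastest.
        acq = sigma / (np.abs(mu - TARGET) + 1e-3)
        x_next = grid[np.argmax(acq)]
        X = np.vstack([X, x_next])
        y = np.append(y, expensive_cls(x_next[0]))

    best = grid[np.argmin(np.abs(gp.predict(grid) - TARGET))][0]
    print(f"evaluated {len(X)} points; contour crossing near x = {best:.2f}")

Each iteration spends the expensive evaluation where it is most informative about the contour, rather than on a uniform grid.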
LHC experiments, especially ATLAS, use the popular HistFactory package bundled with ROOT to define statistical models based on histogram templates and to perform statistical tests on those models, such as interval estimation and limit setting. In order to facilitate the usage of such experiment-grade likelihood models outside of the collaborations, we present a standalone implementation of HistFactory based purely on widely used Python packages (scipy, numpy) that allows recasters to quickly calculate figures of merit such as upper limits and CLs values (including expected bands). Furthermore, the implementation comes with support for popular modern tensor backends (such as PyTorch and TensorFlow) from the machine-learning community. These backends allow the usage of GPU acceleration as well as automatic differentiation.
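This standalone implementation corresponds to the pyhf package; a minimal usage sketch with a recent pyhf API (which may differ from the version presented in the talk; all yields below are invented) could look like:

    import pyhf

    # A single-channel counting model: signal plus background with
    # uncorrelated background uncertainties (all numbers invented).
    model = pyhf.simplemodels.uncorrelated_background(
        signal=[12.0, 11.0],
        bkg=[50.0, 52.0],
        bkg_uncertainty=[3.0, 7.0],
    )
    data = [51, 48] + model.config.auxdata

    # Observed and expected CLs for signal strength mu = 1.
    cls_obs, cls_exp = pyhf.infer.hypotest(
        1.0, data, model, test_stat="qtilde", return_expected=True
    )
    print(f"CLs observed: {cls_obs:.3f}, CLs expected: {cls_exp:.3f}")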
We scrutinize the allowed parameter space of Little Higgs models with the discrete symmetry of T-parity by providing comprehensive LHC analyses of all relevant production channels of heavy vectors, top partners, heavy quarks and heavy leptons, and all phenomenologically relevant decay channels. Constraints on the model will be derived from the signatures of jets and missing energy or leptons and missing energy by using the collider phenomenology tool CheckMATE, which exploits all available LHC BSM searches at center-of-mass energies of 8 and 13 TeV. Besides the symmetric case, we also study the case of T-parity violation.
Four-top-quark processes are a relatively common signature of BSM theories, typically arising from the pair production of strongly-coupled new physics fields. We present the recasting of a recent CMS search for four top quarks with a same-sign lepton final state and its implementation in the MadAnalysis 5 PAD framework. As an application, we derive new bounds on octet (pseudo-)scalar fields.
It is well known that the interpretation of experimental data depends on the theoretical model used for it. I will show a few examples of signals that are quite elusive and whose obvious interpretation, should any of them appear in data, could be completely wrong.
We present the current perspectives for SUSY, given the public Run 2 results, in a phenomenological Minimal Supersymmetric Standard Model scenario with eleven parameters (pMSSM11).
The study, from the MasterCode collaboration, includes the most important limits on SUSY coming from searches at Runs 1 and 2 of the LHC, as well as the compatibility with the observed Higgs signal and constraints coming from electroweak precision data and flavor physics. Cosmological data and direct searches for dark matter are also taken into account. Particular attention has been given to the impact of the constraint coming from the current measurement of the anomalous magnetic moment of the muon.
A light singlino is a promising candidate for dark matter, and a light higgsino is natural in the parameter space of the NMSSM. We study constraints from recent searches for electroweakinos at the LHC on scenarios which are consistent with the dark matter relic density and bounds on dark matter direct detection cross sections. Benchmark points for future searches are proposed.
I will discuss first results of recasting early LHC Run-2 electroweak SUSY searches with GAMBIT, the Global and Modular Beyond-the-Standard-Model Inference Tool. I will review the searches that have been recast for this analysis and their specific implementations within the GAMBIT ColliderBit framework. Assuming a model with decoupled sfermions and gluinos, I will present combined constraints on the electroweak SUSY parameter space and highlight the impact of signal-region correlations on the recast results.
PhenoData (part of the HEPMDB project) is intended to centrally and effectively store data from HEP papers which do not provide public data elsewhere, in order to avoid duplication of work by HEP researchers in digitizing plots. It also allows users to store information about recasting codes and analyses related to the respective HEP paper. The database has an easy search interface and paper identification via arXiv, DOI or preprint numbers, in addition to an API for batch upload/download of data. It has recently been extended with the option to perform comprehensive searches including collider signature and recast-code availability.
We will give an update on developments of the experiment-internal analysis preservation and reusability efforts. We present the latest results of experiment-internal reinterpretations, in part derived through the RECAST framework, including an integration of both the CheckMATE 2 and MadAnalysis 5 analysis catalogues. Further, we will discuss progress on generic analysis preservation with examples from CMS and ATLAS within the CERN Analysis Preservation (CAP) project, and a new integration of CAP with the analysis re-execution platform REANA, which is planned to serve as a common backend to both CAP and RECAST.
"CutLang" software package contains a domain specific language that aims to provide a clear, human readable way to define HEP analyses, and an interpretation framework of that language. A proof of principle (PoP) implementation of the CutLang interpreter, achieved using C++ as a layer over the CERN data analysis framework ROOT, is presently available. This PoP implementation permits writing HEP analyses as a set of commands in human readable text files, which are interpreted by the framework at runtime. Initial experience with CutLang has shown that a just-in-time interpretation of a human readable HEP specific language is a practical alternative to analysis writing using compiled languages such as C++.The main features of the CutLang language and its interpreter will be presented in two educational analysis examples.
LHC result reinterpretation typically requires more information than is provided in the result publication: detector response effects, acceptances of the different steps of the event selection needed to validate the analysis emulation, and possibly complements to the event selection and analysis procedure. A standard to unambiguously describe an LHC analysis and provide all the needed information was proposed two years ago. In the last two years, we have moved from a concept to concrete tools. I will present the use of the Les Houches Analysis Description Accord to present LHC results, the automatic generation of analysis reinterpretation code, and the validation of the analysis description. This approach will be compared with alternative ones.
LLP workshop timetable at https://indico.cern.ch/event/714087/timetable/
The Delphes package offers a generic simulation of a high-energy physics detector. The simulation consists of a mixture of parametric and algorithmic approaches and provides a good compromise between the transfer-function method and a full detector simulation. In this talk, we will show the latest Delphes performance results and explain step by step how LLPs are processed in the Delphes workflow. We will focus on the LLP experimental signatures reproducible by this kind of simulation and explain the difficulties met for the very exotic signatures.
The recasting of LLP searches with the MadAnalysis 5 framework uses a special tune of Delphes which adds new features to the original package. The content of the tune and a concrete recast example will also be covered in this talk.
Work is currently ongoing to expand the ColliderBit module in GAMBIT to allow recasting of LHC searches for long-lived particles. We are aiming to implement both quick recasts, using maps of acceptance times efficiency as functions of the LLP mass and lifetime, and more detailed simulations. I will report on our progress so far, both on the implementation of the required physics calculations in GAMBIT and on some structural updates of the ColliderBit code, and I will discuss our plans for further development.
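The quick-recast strategy can be sketched generically as follows (not GAMBIT code; the grid, luminosity and limit below are all invented for illustration): tabulated acceptance-times-efficiency values are interpolated in mass and lifetime and combined with the model's cross section to compare against a published upper limit:

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Invented acceptance-times-efficiency map on a (mass, lifetime) grid,
    # of the kind an experiment might publish for an LLP search.
    masses = np.array([100.0, 300.0, 500.0, 1000.0])   # GeV
    ctaus = np.array([1.0, 10.0, 100.0, 1000.0])       # mm
    acc_eff = np.array([
        [0.01, 0.05, 0.04, 0.01],
        [0.02, 0.09, 0.07, 0.02],
        [0.02, 0.11, 0.08, 0.02],
        [0.01, 0.08, 0.06, 0.01],
    ])

    # Interpolate in log(ctau), since efficiencies vary over decades in lifetime.
    interp = RegularGridInterpolator((masses, np.log10(ctaus)), acc_eff)

    def expected_signal(mass, ctau, xsec_pb, lumi_ifb=36.0):
        """Predicted signal yield for a model point (all inputs hypothetical)."""
        eff = interp([[mass, np.log10(ctau)]])[0]
        return xsec_pb * 1000.0 * lumi_ifb * eff   # pb -> fb, times fb^-1

    # A model point is excluded if its predicted yield exceeds the
    # published 95% CL upper limit on the number of signal events.
    n_limit = 8.0                                   # invented upper limit
    n_sig = expected_signal(mass=400.0, ctau=50.0, xsec_pb=0.01)
    print(f"predicted signal: {n_sig:.1f} events; excluded: {n_sig > n_limit}")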
LLP workshop timetable at https://indico.cern.ch/event/714087/timetable/
SModelS provides an efficient framework to reinterpret new physics searches at the LHC in arbitrary theories with Z_2-odd BSM sectors via a decomposition into simplified-model topologies. We show how the signatures of heavy stable charged particles and R-hadrons are embedded in this framework. We discuss the recasting of these exotic searches and the generation of the simplified-model database. Finally, we present two exemplary applications: supersymmetry with a gravitino LSP and the inert doublet model.
Long-lived particles appearing in models in which dark matter is produced via the freeze-in mechanism can be probed at the LHC. This is illustrated for the case of a long-lived charged fermion which decays into dark matter and a lepton (electron or muon), using a search for heavy stable charged particles and a displaced lepton search by the CMS collaboration.
Taking the scotogenic FIMP model as an example, I discuss LHC signatures which arise in models with dark matter freeze-in. The small couplings required to reproduce the observed dark matter abundance translate into decay lengths for the next-to-lightest dark-sector particle which can be macroscopic, potentially leading to spectacular signatures at the LHC. I present the leading experimental signatures of the model and discuss how limits can be obtained by recasting LHC searches for long-lived particles.