14–16 May 2018
CERN
Europe/Zurich timezone

Session

Methods & Tools

15 May 2018, 09:00
500/1-001 - Main Auditorium (CERN)



  1. Jong Soo Kim
    15/05/2018, 09:00
    Methods & tools

    In this talk, we will give a brief overview of CheckMATE 2 and discuss recent developments. In particular, we will review our current efforts to implement long-lived particle searches, the addition of a new linear-collider module, and our attempt to include jet-substructure searches. Finally, we will discuss recent problems in implementing exotic as well as supersymmetric analyses from...

  2. Dr Guillaume Chalons (LPSC)
    15/05/2018, 09:20
    Methods & tools

    We will discuss the new functionalities of MadAnalysis v1.6, as well as recent extensions of the MadAnalysis5 Public Analysis Database for the recasting of LHC results. In particular, the new functionalities include direct handling of signal-region definitions in the normal mode of MadAnalysis5, options for dealing with events featuring both fat and normal jets, and support for...

  3. Tomas Gonzalo (University of Oslo)
    15/05/2018, 09:40
    Methods & tools

    After the release of GAMBIT 1.0 last year, many improvements have been implemented and more are in the pipeline. GAMBIT 1.1 brought the ability to add backends in Mathematica, and backends in Python will follow very soon. Most importantly, significant enhancements have been made to ColliderBit, the collider module of GAMBIT. These range from the addition of new searches,...

  4. Dr Wolfgang Waltenberger (Austrian Academy of Sciences (AT))
    15/05/2018, 10:00
    Methods & tools

    SModelS is a tool that allows for a systematic application of the LHC's simplified-model results to an arbitrary BSM model. In this talk I shall discuss recent and future developments of the code, as well as of the database of SMS results, with a focus on the new statistical treatment of correlated signal regions.

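The core simplified-model test sketched in the abstract can be illustrated in a few lines: compare the predicted cross section times branching ratio of each topology to the corresponding experimental 95% CL upper limit. This is an illustrative toy, not SModelS code; the topology names, cross sections, and limits below are hypothetical.

```python
# Illustrative sketch of the simplified-model comparison (not SModelS code):
# each topology carries a predicted sigma x BR, compared against the
# experimental 95% CL upper limit. All numbers below are hypothetical.

def r_values(predictions, upper_limits):
    """Return topology -> r = predicted / upper limit for shared topologies."""
    return {t: predictions[t] / upper_limits[t]
            for t in predictions if t in upper_limits}

def is_excluded(predictions, upper_limits, threshold=1.0):
    """A model point is excluded if any topology exceeds its upper limit."""
    return any(r >= threshold for r in r_values(predictions, upper_limits).values())

# Hypothetical model point: sigma x BR in pb per topology.
pred = {"T1": 0.12, "T2": 0.03}
# Hypothetical experimental 95% CL upper limits in pb.
ul = {"T1": 0.10, "T2": 0.50}

print(r_values(pred, ul))     # T1 exceeds its limit (r = 1.2)
print(is_excluded(pred, ul))  # True
```

A real implementation must first decompose the BSM spectrum into these topologies; the comparison step itself reduces to the r-value test above.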
  5. Florian Staub (KIT - Karlsruhe Institute of Technology (DE))
    15/05/2018, 11:00
    Methods & tools

    I will give a short summary of how non-minimal supersymmetric models can nowadays be studied with a precision comparable to that of the MSSM or NMSSM. The setup is based on the Mathematica package SARAH, which fully automatically generates a modified SPheno version for a given model; this calculates, for instance, the Higgs masses at the two-loop level and makes predictions for the most important flavour...

  6. Sascha Caron (Nikhef National institute for subatomic physics (NL))
    15/05/2018, 11:20
    Methods & tools

    BSM-AI and SUSY-AI use a range of machine-learning tools to learn the exclusion contours and model likelihoods of arbitrary high-dimensional (beyond two parameters) models for new physics. We discuss various examples and use cases. Furthermore, we present a database for storing model evaluations for the HEP phenomenology community.

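The underlying idea, learning the exclusion status of parameter points from a set of already-evaluated points, can be shown with a toy two-dimensional example. The real tools work in far higher dimensions with more sophisticated classifiers; the 1-nearest-neighbour rule and circular boundary below are purely illustrative.

```python
# Toy sketch of the idea behind BSM-AI/SUSY-AI (not the actual code):
# learn the excluded/allowed label of parameter points from already-evaluated
# points, here with a 1-nearest-neighbour rule in a hypothetical 2D plane.
import math
import random

random.seed(0)

def true_label(p):
    # Stand-in for an expensive limit-setting evaluation:
    # the toy exclusion region is the unit disc.
    return p[0] ** 2 + p[1] ** 2 < 1.0  # True = excluded

# "Training" set: parameter points with known exclusion status.
train = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(2000)]
labels = [true_label(p) for p in train]

def predict(p):
    """1-NN prediction of the exclusion status of a new point."""
    i = min(range(len(train)), key=lambda j: math.dist(p, train[j]))
    return labels[i]

print(predict((0.1, 0.1)))   # deep inside the toy region: excluded
print(predict((1.9, 1.9)))   # far outside: allowed
```

Once trained, the classifier replaces the expensive evaluation, which is what makes scanning high-dimensional parameter spaces tractable.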
  7. Lukas Alexander Heinrich (New York University (US))
    15/05/2018, 11:40
    Methods & tools

    A common goal in reinterpretations is the determination of iso-hyper-surfaces with constant values of, e.g., a likelihood or a test statistic such as CLs (for example, in two dimensions the iso-surface is the contour CLs == 0.05). Traditionally, Cartesian grids are used in two dimensions, but the cost of generating Monte Carlo events for each point makes this approach prohibitive in higher...

  8. Lukas Alexander Heinrich (New York University (US))
    15/05/2018, 11:50
    Methods & tools

    LHC experiments, especially ATLAS, use the popular HistFactory package bundled with ROOT to define statistical models based on histogram templates and to perform statistical tests on those models, such as interval estimation and limit setting. In order to facilitate the use of such experiment-grade likelihood models outside of the collaborations, we present a standalone implementation of...

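The kind of statistical model HistFactory describes can be illustrated in a deliberately minimal single-bin form with no nuisance parameters. This is a sketch of the general likelihood-ratio machinery used for limit setting, not the package's actual API, and all counts below are hypothetical.

```python
# Minimal single-bin sketch of a HistFactory-style statistical model
# (illustrative only): observed count n, expected rate mu*s + b, and the
# likelihood-ratio test statistic used as input to limit setting.
import math

def nll(n, rate):
    """Negative log Poisson likelihood, dropping the constant n! term."""
    return rate - n * math.log(rate)

def q_mu(mu, n, s, b):
    """-2 ln [ L(mu) / L(mu_hat) ] with mu_hat = max((n - b)/s, 0)."""
    mu_hat = max((n - b) / s, 0.0)
    return 2.0 * (nll(n, mu * s + b) - nll(n, mu_hat * s + b))

# Hypothetical single bin: 12 observed, b = 10 expected background,
# s = 5 expected signal events at signal strength mu = 1.
print(q_mu(1.0, n=12, s=5.0, b=10.0))
print(q_mu(0.4, n=12, s=5.0, b=10.0))  # mu = mu_hat here, so q = 0
```

Real HistFactory models add binned templates and constrained nuisance parameters, but the test statistic evaluated on the model has this same likelihood-ratio structure.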
  9. 15/05/2018, 12:05
  10. Daniel Locke (University of Southampton)
    15/05/2018, 16:30
    Methods & tools

    PhenoData (part of the HEPMDB project) is intended to centrally and effectively store data from HEP papers that do not provide public data elsewhere, in order to avoid duplication of effort by HEP researchers in digitizing plots. It also allows one to store information about recasting codes and analyses related to the respective HEP paper. This database has an easy search interface and paper...

  11. Lukas Alexander Heinrich (New York University (US))
    15/05/2018, 16:50
    (Re)interpretation studies

    We will give an update on developments of the experiment-internal analysis preservation and reusability efforts. We present the latest results of experiment-internal reinterpretations, in part derived through the RECAST framework and an integration of both the CheckMate2 and MadAnalysis analysis catalogues. Further, we will discuss progress on generic analysis preservation with examples from CMS...

  12. Gokhan Unel (University of California Irvine (US))
    15/05/2018, 17:10
    Methods & tools

    The "CutLang" software package contains a domain-specific language that aims to provide a clear, human-readable way to define HEP analyses, together with an interpretation framework for that language. A proof-of-principle (PoP) implementation of the CutLang interpreter, built in C++ as a layer over the CERN data-analysis framework ROOT, is presently available. This PoP implementation permits writing...

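The general idea of a cut DSL plus interpreter can be sketched in miniature: parse simple "variable op value" clauses joined by AND, then apply them to event records. This toy grammar is invented for illustration and is not CutLang's actual syntax or implementation.

```python
# Toy sketch of a cut-DSL interpreter (hypothetical grammar, not CutLang):
# parse "variable op value" clauses joined by AND, apply them to events.

OPS = {
    ">":  lambda a, b: a > b,
    "<":  lambda a, b: a < b,
    ">=": lambda a, b: a >= b,
    "<=": lambda a, b: a <= b,
}

def parse_cut(text):
    """Parse e.g. 'pT > 20 AND absEta < 2.5' into (name, op, value) triples."""
    cuts = []
    for clause in text.split(" AND "):
        name, op, value = clause.split()
        cuts.append((name, OPS[op], float(value)))
    return cuts

def passes(event, cuts):
    """True if the event (a dict of named quantities) passes every cut."""
    return all(op(event[name], value) for name, op, value in cuts)

selection = parse_cut("pT > 20 AND absEta < 2.5")
events = [{"pT": 35.0, "absEta": 1.1},   # passes
          {"pT": 15.0, "absEta": 0.3},   # fails the pT cut
          {"pT": 50.0, "absEta": 3.0}]   # fails the eta cut
print([passes(e, selection) for e in events])  # [True, False, False]
```

Separating the textual description from its interpreter is what lets the same analysis definition be read by humans and executed by machines.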
  13. Philippe Gras (Université Paris-Saclay (FR))
    15/05/2018, 17:30
    Methods & tools

    LHC result reinterpretation typically requires more information than is provided in the result's publication: detector response effects, the acceptances of the different steps of the event selection needed to validate the analysis emulation, and possibly complements to the event selection and analysis procedure. A standard to unambiguously describe an LHC analysis and provide all needed...

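What such a machine-readable description might capture can be sketched as a plain data structure of ordered selection steps, plus an interpreter that records the per-step acceptances needed for validation. The format and all cut values below are hypothetical, invented for illustration; they do not represent any agreed standard.

```python
# Illustrative sketch (hypothetical format, not an agreed standard) of a
# machine-readable analysis description: an ordered list of named selection
# steps, applied to events while recording the cutflow used for validation.

DESCRIPTION = {
    "name": "toy_monojet",
    "steps": [
        ("trigger",     lambda e: e["met"] > 100.0),
        ("leading_jet", lambda e: e["jet_pt"] > 120.0),
        ("lepton_veto", lambda e: e["n_leptons"] == 0),
    ],
}

def cutflow(events, description):
    """Apply each step in order; return the surviving count after each step."""
    counts, surviving = [], list(events)
    for name, cut in description["steps"]:
        surviving = [e for e in surviving if cut(e)]
        counts.append((name, len(surviving)))
    return counts

events = [
    {"met": 150.0, "jet_pt": 130.0, "n_leptons": 0},  # passes everything
    {"met": 150.0, "jet_pt": 90.0,  "n_leptons": 0},  # fails leading_jet
    {"met": 80.0,  "jet_pt": 200.0, "n_leptons": 0},  # fails trigger
    {"met": 160.0, "jet_pt": 140.0, "n_leptons": 1},  # fails lepton_veto
]
print(cutflow(events, DESCRIPTION))
```

Comparing such a per-step cutflow against the numbers published by the experiment is exactly the validation step the abstract says the extra information is needed for.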