(Re)interpretation of the LHC results for new physics
This is the 9th general workshop of the “Forum on the interpretation of the LHC results for BSM studies”, or LHC Reinterpretation Forum (RIF) for short. Its aim is to review new developments on the tools, phenomenology, and experimental sides regarding data and analysis preservation and reuse.
Emphasis at this workshop will be given to current and future developments regarding analysis preservation and reuse, and to recommendations of best practices, in particular with regard to the upcoming European Strategy for Particle Physics Update. Moreover, an extensive session will be devoted to recent reinterpretation studies. Cross-talk with the various LHC working groups is encouraged.
Background: The purpose of the RIF is to discuss topics related to the BSM (re)interpretation of LHC data, including the development of the necessary public recasting tools and related infrastructure, and to provide a platform for continued interaction between theorists and the experiments. So far, eight general workshops have been held at CERN, Fermilab, Imperial College London and IPPP Durham. They resulted in the 2020 report on Reinterpretation of LHC Results for New Physics: Status and recommendations after Run 2, arXiv:2003.07868, and the white paper on Publishing statistical models: Getting the most out of particle physics experiments, arXiv:2109.04981 (both published in SciPost Physics). Moreover, the experiences from this Forum heavily informed the Snowmass white paper on Data and analysis preservation, recasting and reinterpretation, arXiv:2203.10057.
The RIF is an LHC-physics initiative supported and promoted by the LPCC. This workshop marks the transition from the RIF to the LHC REI WG.
-
1
Welcome and introduction 4/3-006 - TH Conference Room
Speaker: Sabine Kraml (LPSC Grenoble)
-
Reinterpretation studies (pheno) 4/3-006 - TH Conference Room
-
2
Exploring a Composite Dark Matter Model Using CONTUR and MadAnalysis5
In this study, we place constraints on a composite t-channel dark matter model using CONTUR and MadAnalysis 5. The model is similar to those currently being studied by the DM working group and is representative of composite dark matter constructions, with a scalar dark matter candidate and three VLQ partners (mediators). We investigate the exclusions obtained from the subprocesses in LO and NLO simulations using both of these frameworks.
Speaker: Clarisse Prat -
3
Probing exotic long-lived particles from the prompt side using the CONTUR method
A method to derive constraints on new physics models featuring exotic long-lived particles using detector-corrected measurements of prompt states is presented. The CONTUR workflow is modified to either account for the fraction of long-lived particles which decay early enough to be reconstructed as prompt, or to be sensitive to the recoil of such particles against a prompt system. This makes it possible to determine how many signal events would be selected in the RIVET routines which encapsulate the fiducial regions of dozens of measurements of Standard Model processes by the ATLAS and CMS collaborations. New constraints are set on several popular exotic long-lived particle models in the very short-lifetime or very long-lifetime regimes, which are often poorly covered by direct searches. The probed models include feebly-interacting dark matter, hidden sector models mediated by a heavy neutral scalar, dark photon models and a model featuring photo-phobic axion-like particles.
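As a rough illustration of the first ingredient above, the fraction of long-lived particles decaying within a given distance of their production point follows from the exponential decay law. This is a generic sketch under our own assumptions (function name and millimetre units are illustrative), not the actual CONTUR implementation:

```python
import math

def prompt_fraction(ctau_mm: float, beta_gamma: float, l_max_mm: float) -> float:
    """Probability that a particle with proper decay length c*tau (in mm)
    and boost beta*gamma decays within l_max_mm of its production point."""
    lab_decay_length = beta_gamma * ctau_mm  # mean decay length in the lab frame
    return 1.0 - math.exp(-l_max_mm / lab_decay_length)
```

For very short proper lifetimes this fraction approaches one, and the particle is effectively prompt; for very long lifetimes it vanishes and only the recoil of the prompt system remains sensitive.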
Speaker: Simon Jeannot -
4
Limits on an Exotic Higgs Decay From a Recast ATLAS Four-Lepton Analysis
The ATLAS collaboration, using 139 fb$^{-1}$ of 13 TeV collisions from the Large Hadron Collider, has placed limits on the decay of a $Z$ boson to three dark photons. We reproduce the results of the ATLAS analysis, and then recast it as a limit on an exotic Higgs decay mode, in which the Higgs boson decays via a pair of intermediate (pseudo)scalars $a$ to four dark photons $V$ (or some other spin-one particle). Across the mass range for $m_a$ and $m_V$, we find limits on the exotic Higgs branching fraction BR$(H\to aa \to VVVV)$ in the range of $4\times 10^{-5}$ to $1 \times 10^{-4}$.
Speaker: Junyi Cheng (Harvard University) -
5
Constraining composite Higgs models with LHC data
I will present the results of two recent recasting studies using MadAnalysis5, CheckMATE and Rivet/Contur. I will focus on processes that are interesting for composite Higgs models. The first is a study of Drell-Yan production of extended Higgs sectors, putting upper limits on cross section times branching ratio for a large number of final states. Besides the limits themselves, this serves as an interesting overview of the recasting landscape. In a second step we consider pair production of vector-like quarks with exotic decays via the aforementioned scalars.
Speaker: Manuel Kunkel -
15:35
coffee & tea
-
6
Reinterpreting LHC Dark QCD results
In this contribution, we will present our recent studies on recasting various ATLAS and CMS analyses in the context of dark QCD models. The analyses were implemented in MadAnalysis5, and the reinterpretation focuses on an s-channel Z' portal decaying to two dark quarks, which leads to a signature of semi-visible jets with varying invisible fraction.
Speaker: Nicoline Hemme -
7
Tri-Lepton Signal of HNL in an EFT Framework
Heavy neutral lepton (HNL) states can exist at the electroweak scale and can be investigated through current and future collider experiments. The scenario, where other new physics interactions occur at scales much higher than the HNL scale, can be described using an effective field theory (EFT) framework known as NR-EFT. We focus on constraining the Wilson coefficients of NR-EFT operators, which primarily contribute to tri-lepton plus missing energy signals at the LHC. We scrutinise the implications of the cuts proposed in the HNL search by CMS for the scenario where HNL production at the LHC is dominated by NR-EFT operators, and propose new variables which are more relevant for some of the signal topologies.
Speaker: Manimala Mitra -
8
Constraints on a Two Mediator Dark Matter Model from Simplified Models
We consider a simple Dark Matter model containing both a scalar and a vector mediator. The model provides a framework for generating the mediator masses and the Dark Matter mass while maintaining a small number of free parameters. Using simplified model results from missing energy and resonance searches, we discuss how well this model is probed by current searches and the interplay between the distinct types of searches. Some of the limitations and issues found when trying to reinterpret simplified model results from exotics searches will also be addressed.
Speaker: Andre Lessa (CCNH - Univ. Federal do ABC) -
9
Dark Matter from Anomaly Cancellation
We discuss a class of theories that predict a fermionic dark matter candidate from gauge anomaly cancellation. As an explicit example, we study the predictions in theories where the global symmetry associated with baryon number is promoted to a local gauge symmetry. In this context, the symmetry-breaking scale has to be below the multi-TeV scale in order to be in agreement with the cosmological constraints on the dark matter relic density. The new physical “Cucuyo” Higgs boson in the theory has very interesting properties, decaying mainly into two photons in the low mass region, and mainly into dark matter in the intermediate mass region. We study the most important signatures at the Large Hadron Collider, evaluating the experimental bounds. We discuss the correlation between the dark matter relic density, direct detection, and collider constraints. We find that these theories are still viable and can be probed in current and future high-luminosity running.
Speaker: Hridoy Debnath
-
Experiment(-related) contributions 4/3-006 - TH Conference Room
-
10
CMS public statistical models
This talk will give an overview of the CMS statistical model format (Combine datacards) and demonstrate their evaluation with Combine. In addition, the review process, automated in GitLab CI, will be highlighted.
Speaker: Dr Giacomo Ortona (Universita e INFN Torino (IT))
-
Open Data and experiment(-related) contributions 4/3-006 - TH Conference Room
-
11
Research use of event-level open data - how to get there
As agreed in the CERN Open Data policy, all LHC experiments are committed to releasing research-quality open data. CMS has pioneered this effort and now celebrates a decade of regular data releases, with all LHC Run 1 data available in the public domain and ongoing releases of Run 2 data. This talk will provide an opportunity for the audience to reflect on what makes event-level open data reusable for the scientific community and the main challenges involved.
In a few years, the LHC will enter the High-Luminosity phase, with increasing data volumes and potential resource challenges for open data initiatives. Community support for open science activities is crucial. A common understanding that these open data are the scientific heritage of the LHC, and the only available collision data for decades to come, is essential for securing the necessary resources beyond Run 2.
Speaker: Kati Lassila-Perini (Helsinki Institute of Physics (FI)) -
12
Data Reuse for MC Tuning and Validation
For the development of modern event generators, the comparison to data is an invaluable tool for the tuning and validation of the code. I will review efforts where the comparisons to data from past colliders have allowed for improvements and extensions of event generators, with an emphasis on photoproduction and hard diffraction at HERA.
Speaker: Peter Meinzinger (Zürich University) -
13
Open Event Generation
This contribution will describe ongoing efforts to provide event generation output to the broader community. There are a number of advantages that such a project could offer: reduced waste, easier project uptake, better validation, and improved communication between the experimental and phenomenological communities, among others.
Speaker: Zach Marshall (Lawrence Berkeley National Lab. (US)) -
14
Current CERN platforms for Reproducible and Interactive scientific analysis
Experiment analysis frameworks, physics data formats, and the expectations of LHC scientists have evolved towards including interactive analysis with short turnaround times and the possibility to optimize reproducible and re-interpretable workflows.
The CERN IT's Pilot Analysis Facility, the CERN Virtual Research Environment, and REANA have emerged as key solutions, as well as a platform dedicated to Machine Learning.
REANA is an open-source software framework, developed mostly within the CERN IT department, focussed on running containerised analysis workflows with an emphasis on the re-analysis and reproducibility of scientific results. We shall describe how REANA has been used, notably by the ATLAS collaboration, for pMSSM reinterpretations of LHC Run 2 analyses.
The Pilot Analysis Facility facilitates interactive, notebook-based analysis and enables scaling out from SWAN to the extensive local HTCondor-managed compute resources.
Similarly, the Virtual Research Environment offers an inclusive, CERN-independent authentication mechanism allowing registered users to benefit from an interactive analysis environment that can be interfaced directly with data management (such as Rucio) and reproducibility (e.g. REANA) tools.
These tools form an integrated ecosystem enabling users to conduct, manage, and replicate complex analyses while promoting accessibility and collaboration. By emphasising user-friendly interfaces and middleware, it optimises data analysis from access and computations to replicable scientific outcomes, fostering open collaboration across diverse physics communities.
This is complemented by the CERN ML Project ( ml.cern.ch ), which aims to offer a centralized service to manage the full machine-learning lifecycle, providing access to various accelerator resources like GPUs, TPUs and FPGAs.
Speaker: Giovanni Guerrieri (CERN) -
15
Introduction to the Virtual Research Environment: an end-user perspective
One of the objectives of the EOSC (European Open Science Cloud) Future Project was to integrate diverse analysis workflows from Cosmology, Astrophysics and High Energy Physics in a common framework. This led to the inception of the Virtual Research Environment (VRE) at CERN, a prototype platform supporting the goals of the Dark Matter and Extreme Universe Science Projects in compliance with FAIR (Findable, Accessible, Interoperable, Reusable) data policies. The goal of the project was to highlight the synergies between different dark matter communities and experiments, by producing new scientific results as well as by making the necessary data and software tools fully available. The VRE makes use of a common authentication and authorisation infrastructure (AAI), and shares the different experimental data (ATLAS, Fermi-LAT, CTA, DarkSide, KM3NeT, Virgo, LOFAR) in a reliable distributed storage infrastructure via the ESCAPE Data Lake. The entry point of the platform for an end-user is a JupyterHub instance deployed on top of a scalable Kubernetes infrastructure, providing an interactive graphical interface for researchers to access, analyse and share data. Data access and browsability are enabled through API calls to the high-level data management and storage orchestration software (Rucio). The VRE aims to streamline the development of end-to-end physics workflows, granting researchers access to an infrastructure that contains easy-to-use physics analysis workflows from different experiments. In this contribution, I will provide an overview of the VRE, highlight its use cases as an analyser for implementing and reproducing experimental analyses on a REANA cluster, and showcase the successful integration of an ATLAS experimental analysis workflow into the VRE platform.
Speaker: Sukanya Sinha (The University of Manchester (GB)) -
10:45
coffee & tea
-
17
HS³
The complexity of modern high-energy physics (HEP) experiments demands robust, flexible, and interoperable tools for statistical modeling. The HEP Statistics Serialization Standard (HS³) addresses this need by providing a unified framework for serializing statistical models and datasets in HEP research that allows seamless switching between different implementations and modeling frameworks. HS³ ensures compatibility across diverse tools and workflows, promoting reproducibility, transparency, and collaborative analysis. This talk will present the design principles of HS³ and its growing ecosystem of implementations, which includes integrations with popular frameworks. Key use cases will highlight the benefits of HS³ in facilitating cross-experiment collaboration, simplifying model sharing and paving the way towards a future of FAIR (findable, accessible, interoperable, reusable) models. By fostering standardization in HEP data serialization, HS³ aims to accelerate scientific discovery and set a precedent for interoperability in related fields of science.
Speaker: Dr Carsten Burgard (Technische Universitaet Dortmund (DE)) -
18
BDT as a Surrogate Model
I will present the reinterpretation material of the CalRatio + X ATLAS analysis (arXiv:2407.09183). The analysis focuses on neutral long-lived particles decaying within the ATLAS hadronic calorimeter. The reinterpretation involves a Boosted Decision Tree (BDT) trained on truth-level variables to estimate the probability of events within the ABCD plane and assess the sensitivity of the analysis. The BDT weights, along with a Python code example, are available on HEPData to ensure reproducibility. Additionally, I will discuss how this method can be extended to other analyses, providing guidance for broader applications.
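The ABCD plane mentioned above refers to the standard data-driven background method, in which two (assumed uncorrelated) discriminating variables partition events into four regions. As a reminder, here is the textbook estimate in a minimal sketch (a generic illustration, not this analysis's BDT-based per-event probabilities):

```python
def abcd_estimate(n_a: float, n_b: float, n_c: float) -> float:
    """Standard ABCD background estimate for the signal region D.
    With regions A (both variables fail), B and C (one variable passes each),
    and uncorrelated variables: N_D = N_B * N_C / N_A."""
    return n_b * n_c / n_a
```

The surrogate BDT in the talk refines this picture by predicting, event by event, the probability of landing in each region of the plane.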
Speakers: Abdelhamid Haddad (Laboratoire de Physique de Clermont-Auvergne (LPCA)), Dr Louie Dartmoor Corpe (Laboratoire de Physique Clermont Auvergne (LPCA))
-
12:30
Lunch
-
Experiment(-related) contributions 4/3-006 - TH Conference Room
-
19
ATLAS analysis preservation efforts
Speaker: Martin Habedank (University of Glasgow)
-
20
CMS analysis preservation efforts
This talk will cover the analysis preservation efforts in CMS, including BSM analysis implementations in REANA and, possibly, experience with HEPData.
Speaker: Alejandro Gomez Espinosa (Carnegie-Mellon University (US)) -
21
A High-Dimensional and Unbinned SM Measurement with the ATLAS Detector
Traditional approaches to precise Standard Model (SM) measurements of fundamental particles at the LHC generally restrict the format of these measurements to just one or two properties at a time in predetermined histogram bins. The ATLAS Experiment recently published such a measurement in a notable new format for LHC experiments: high-dimensional and unbinned datasets that can be used for a wide range of scientific applications. This precision measurement of high-momentum $Z$ boson events uses neural networks to reduce detector distortions and therefore facilitate direct comparison with theoretical QCD predictions. Physicists can easily configure the datasets to produce traditional binned measurements of any of the measured properties, or arbitrary combinations of them, with full uncertainty covariances and customized binning.
Speaker: Mariel Pettee (Lawrence Berkeley National Lab. (US)) -
22
Machine-learning preservation from ATLAS' view
As machine learning becomes increasingly embedded in ATLAS analyses, ensuring the long-term usability of ML-based results poses several challenges. This talk will examine different aspects of ML preservation, including storing and documenting trained models, handling evolving software dependencies, and ensuring accessibility of input features tied to detector conditions. We will discuss the difficulties of rerunning ML-based analyses years after publication and highlight key areas where improvements are needed to support reinterpretation and reproducibility in the future.
Speaker: Tomasz Procter (University of Glasgow (GB)) -
23
Machine learning preservation in CMS
This talk summarizes recommendations and status of efforts towards preservation of ML models (and accompanying validation material) in CMS, in particular in view of reuse for BSM (re)interpretations outside the collaboration.
Speaker: Joshua Hiltbrand (Baylor University (US)) -
24
Delphes: status and ideas towards automated tuning
Speaker: Michele Selvaggi (CERN)
-
25
white paper invitation 4/3-006 - TH Conference Room
-
Likelihoods, public reinterpretation tools 4/3-006 - TH Conference Room
-
26
Profile Likelihoods on ML-Steroids
Global SMEFT analyses combine a vast range of LHC measurements to construct likelihoods to put constraints on physics beyond the Standard Model. However, constructing and evaluating profile likelihoods for such analyses is computationally intensive and prone to instability and noise. We show how modern numerical techniques, similar to neural importance sampling, can dramatically enhance both efficiency and stability. Specifically, we focus on datasets used in previous SFitter analyses, combining data from the Top sector with Higgs, Di-Boson, and electroweak precision measurements to simultaneously constrain up to 42 Wilson coefficients.
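For readers unfamiliar with profiling, a toy one-bin version makes the computational issue concrete: for each value of the parameter of interest, the nuisance parameters must be minimised over, which becomes expensive in many dimensions. This is our own illustration (a brute-force grid scan over a single background nuisance), far simpler than the SFitter setup:

```python
import math

def nll(mu, theta, n_obs, s, b, sigma_b):
    """Negative log-likelihood for a one-bin counting experiment:
    Poisson(n | mu*s + b + theta) times a Gaussian constraint on theta.
    Constant terms independent of mu and theta are dropped."""
    lam = mu * s + b + theta
    return lam - n_obs * math.log(lam) + 0.5 * (theta / sigma_b) ** 2

def profiled_nll(mu, n_obs, s, b, sigma_b):
    """Profile out the nuisance theta for fixed signal strength mu
    by a crude grid scan over +/- 5 sigma."""
    thetas = [i * 0.01 * sigma_b for i in range(-500, 501)]
    return min(nll(mu, t, n_obs, s, b, sigma_b) for t in thetas)
```

With dozens of Wilson coefficients and nuisances, such scans become infeasible, which is precisely where the ML-based sampling techniques of the talk come in.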
Speaker: Nikita Schmal -
27
Exact-Approximate Collider Likelihoods
Monte Carlo simulations used to interpret searches for new physics result in noisy, approximate estimators of selection efficiencies and likelihoods. In this talk, I present an exact-approximate MCMC method that returns unbiased exact inferences despite the underlying noisy simulation. I will introduce an unbiased estimator of the Poisson likelihood and show its behaviour in the context of a search for neutralinos and charginos at the LHC. I will show that the resulting inferences are robust with respect to the number of generated events, so that exact-approximate inference can be obtained without significant additional computational cost.
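The effect the talk addresses can be seen in a toy example: estimating a selection efficiency from a finite Monte Carlo sample makes the plugged-in Poisson likelihood itself a noisy random quantity. This is a generic illustration of the problem (the numbers and names are our own), not the unbiased estimator presented in the talk:

```python
import math
import random

def mc_efficiency(true_eff: float, n_mc: int, rng: random.Random) -> float:
    """Noisy Monte Carlo estimate of a selection efficiency from n_mc events."""
    passed = sum(1 for _ in range(n_mc) if rng.random() < true_eff)
    return passed / n_mc

def poisson_loglike(n_obs: int, mu: float) -> float:
    """Poisson log-likelihood in the expected yield mu, dropping log(n!)."""
    return n_obs * math.log(mu) - mu

# Plugging a noisy efficiency into the expected yield mu = eff * nominal
# makes the likelihood itself fluctuate from one MC sample to the next.
rng = random.Random(42)
nominal = 100.0  # hypothetical luminosity x cross-section x acceptance
estimates = [poisson_loglike(5, nominal * max(mc_efficiency(0.05, 200, rng), 1e-6))
             for _ in range(10)]
```

The spread of `estimates` shrinks only slowly with the MC sample size, which is why an exact-approximate treatment of this noise is attractive.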
Speaker: Christopher Chang -
28
WorkspaceExplorer: Web-based visualisation of full likelihoods
In recent years, LHC experiments have increasingly been publishing the full likelihood models of their statistical analyses on HEPData. For the purpose of reproducing and reinterpreting such analyses, it is essential to understand the contents of these statistical workspaces. As part of the validation process for an ATLAS combination of searches for Beyond-Standard-Model particles, I have developed an open-access, web-based interface called "WorkspaceExplorer" to easily and quickly visualise workspace contents and perform fits; it is freely available at workspaceexplorer.app.cern.ch. In addition to validation, it can be a convenient tool for exploring unfamiliar statistical models, lowering the barrier to using them for reinterpretations. In this talk, I will present the available features and discuss possible use cases for the tool as well as the potential for further developments.
Speaker: Volker Andreas Austrup (The University of Manchester (GB)) -
10:00
coffee & tea
-
29
Seeking an explanation of compressed spectrum excesses at the LHC
ATLAS and CMS have conducted two pairs of broadly equivalent analyses and find excesses in all four cases. These consist of searches for soft leptons (originally motivated by supersymmetric models with compressed spectra) and monojets. I will describe recent work on interpreting these excesses, and quantifying their significance, in terms of both supersymmetric and non-supersymmetric models, and on tying them to other observations such as dark matter. I will also describe the numerical tools being developed along the way.
Speaker: Mark Dayvon Goodsell (Centre National de la Recherche Scientifique (FR)) -
30
Assessing the correlation between MadAnalysis and Rivet implementations
The landscape of frameworks for (re)implementing HEP analyses for reinterpretation is, fortunately, diverse. One advantage of this diversity is that multiple reimplementations of the same HEP analysis in different frameworks can be cross-validated. On the downside, statistical combinations of analyses across different frameworks must carefully avoid double-counting events, which could erroneously inflate statistical significance. In both cases, validation and statistical combination, assessing the correlation between implementations in different frameworks is essential.
In this talk, we present studies determining this correlation for analysis implementations in two selected frameworks: MadAnalysis and Rivet. We highlight key achievements, challenges encountered, and lessons learned, providing insights for future developments.
Speaker: Martin Habedank (University of Glasgow) -
31
ML-based reweighting of BSM signal grids: an example in Rivet
Following on from discussions at previous Reinterpretation Forum meetings on the reinterpretation of ML-based searches, we look at how one of these (ATLAS-SUSY-2018-30) has been used as a test-bed for studies of ML-based reweighting, which can reduce the computational load of studying large parameter spaces. Some new features of YODA 2 which allowed the project to run in Rivet will also be highlighted.
Speaker: Tomasz Procter (Jagiellonian University) -
32
Contur update
An update on Contur (Constraints On New Theories Using Rivet) developments.
Speaker: Joe Egan (University College London)
-
12:30
Lunch
-
Public reinterpretation tools 4/3-006 - TH Conference Room
-
33
ADL/CutLang developments for (re)interpretation
We present recent developments in the Analysis Description Language (ADL) and the runtime interpreter CutLang in the context of (re)interpretation studies. The talk will cover ongoing validation efforts from LHC BSM analyses and various improvements to the infrastructure to accommodate analysis implementation and validation requirements. We will also present studies using ATLAS open data. Additionally, we will highlight key advancements towards building a more formal, robust, and automated interpreter system.
Speaker: Ahmetcan Sansar (Istanbul University (TR)) -
34
Updates to LHC simulations in ColliderBit
ColliderBit is the module responsible for simulating LHC physics in the GAMBIT global fitting tool. In this talk we will present recent technical developments in ColliderBit, including newly implemented LHC analyses, improvements to three-body decay kinematics relevant for supersymmetry searches, as well as updates to interfaces to other codes.
Speaker: Are Raklev (University of Oslo (NO)) -
35
Recent developments in CheckMATE
I will discuss recent developments and new features in CheckMATE, in particular multibin signal regions, new searches using machine learning methods and (perhaps) statistical combination of searches.
Speaker: Krzysztof Rolbiecki (Warsaw University) -
36
SModelS v3: Going Beyond Simple Z2 Topologies
SModelS is a public tool for fast reinterpretation of LHC searches for new physics based on a large database of simplified model results. Version 3 includes a new framework based on a description of arbitrary simplified model topologies as directed graphs. This new development allows the tool to go beyond Z2-preserving topologies, including results from resonance searches, R-Parity violating supersymmetry and more. In this talk we present the main new features of version 3 and illustrate their impact on a Dark Matter model containing two mediators.
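As a schematic of the directed-graph idea (our own toy representation, not SModelS's actual data model), a topology can be stored as an adjacency map from each particle to its decay products, with the final states read off as the leaves of the graph:

```python
def final_states(topology: dict, root: str = "PV") -> list:
    """Collect the leaves (undecayed final-state particles) of a decay graph
    stored as {mother: [daughter, ...]}, starting from the root vertex."""
    daughters = topology.get(root, [])
    if not daughters:
        return [root]
    out = []
    for d in daughters:
        out.extend(final_states(topology, d))
    return out

# A Z'-portal example without any Z2 symmetry: the primary vertex (PV)
# produces a single resonance that decays to two dark-sector particles.
toy_topology = {
    "PV": ["Zprime"],
    "Zprime": ["chi", "chi~"],
}
```

Because nothing in such a graph enforces pair production of odd-sector particles, the same machinery covers resonance topologies and R-parity-violating decay chains alike.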
Speaker: Andre Lessa (CCNH - Univ. Federal do ABC) -
15:20
coffee & tea
-
38
Reinterpretation Enhancements with Rivet v4
The Rivet framework is widely used for analysis preservation and Monte Carlo validation, featuring around 2000 analysis routines, and is primarily used in BSM reinterpretation through the measurement-focused Contur method and tool. However, it is much less established in preservation of BSM searches, despite possessing the essential features such as detector "smearing" of physics objects, and cutflow computation. In this talk we review the major developments in the new Rivet version 4 and its underlying statistics library YODA2, which provide more coherent statistical breakdowns of MC predictions (including automatic propagation of theory uncertainties) and improvements for high-performance computing deployments. We will also present a new user-facing interface and tools designed to reduce entry barriers for BSM-search physics users, and streamline the preservation of BSM analyses in the Rivet collection.
Speaker: Andy Buckley (University of Glasgow (GB)) -
39
HEP Packaging Coordination: Reproducible reuse by default
While advancements in software development practices across particle physics and adoption of Linux container technology have made substantial impact in the ease of replicability and reuse of analysis software stacks, the underlying software environments are still primarily bespoke builds that lack a full manifest to ensure reproducibility across time. The HEP Packaging Coordination community project is bootstrapping packaging of the broader community ecosystem on conda-forge. This process covers multi-platform packaging from low level language phenomenology tools, to the broader simulation stack, to end user analysis tools, and the reinterpretation ecosystem. When combined with next generation scientific package management and manifest tools, the creation of fully specified, portable, and trivially reproducible environments becomes easy and fast, even with the use of hardware accelerators. This ongoing process significantly lowers technical barriers across tool development, distribution, and use, and when combined with public data products provides a transparent system for full analysis reinterpretation and reuse.
Speaker: Matthew Feickert (University of Wisconsin Madison (US)) -
40
Discussion of ESPPU white paper
-
-
Reinterpretation studies/pheno -cont- 4/3-006 - TH Conference Room
-
41
Revisiting the LHC Constraints on Gauge-Mediated Supersymmetry Breaking Scenarios
We revisit Gauge Mediated SUSY Breaking (GMSB) scenarios in the context of LHC data. The ATLAS mono-photon search with 139 fb$^{-1}$ of integrated luminosity at the 13 TeV LHC, interpreted in a simplified General Gauge Mediation (GGM) scenario (a phenomenological version of GMSB that is agnostic about the nature of the hidden sector), relies on assumptions that do not hold across the entire parameter space. We identify a few crucial assumptions regarding the decay widths of SUSY particles into final states with gravitinos that affect the LHC limits on the masses of the SUSY particles. Our study aims to reinterpret the ATLAS constraints on the gluino-NLSP mass plane, considering all possible decay modes of SUSY particles in a realistic GGM model.
Speaker: Mr Rameswar Sahu -
42
Revisiting Universal Extra-Dimension Model with Gravity Mediated Decays
We explore the collider phenomenology of the fat-brane realization of the Minimal Universal Extra Dimension (mUED) model, where Standard Model (SM) fields propagate in a small extra dimension while gravity accesses additional large extra dimensions. This configuration allows for gravity-mediated decay (GMD) of Kaluza-Klein (KK) particles, resulting in unique final states with hard photons, jets, massive SM bosons, and large missing transverse energy due to invisible KK gravitons. We derive updated constraints on the model's parameter space by recasting ATLAS mono-photon, di-photon, and multi-jet search results using 139 fb$^{-1}$ of integrated luminosity.
Speaker: Kirtiman Ghosh -
43
Sensitivity of LHC searches to Inert Doublet Model via Recasting with CheckMATE2
Recasting is an extremely powerful tool to derive limits on new physics models. With so many NP models at our disposal, recasting makes it easy to use the limits derived for certain models by experimental searches to constrain any model of our choice. However, this method can fail if the model of interest differs from the one being recast not only in event rates but also in final-state kinematics. In such cases, an experimental search optimized for a specific model may become completely insensitive to the new model under study. A dedicated search would then be necessary to probe interesting regions of the new model. We present such a case for DM models, namely the Inert Doublet Model, and perform a reinterpretation of the ATLAS full Run-2 data on it.
Speaker: Jayita Lahiri -
10:00
coffee & tea
-
44
New Higgses at the electroweak scale
Statistically significant di-photon excesses at 95 GeV (>3 sigma) and 152 GeV (>4 sigma) have been observed. Furthermore, strong tensions exist between the SM predictions and measurements of W and top-like signatures at the LHC. I discuss the status of these anomalies and detail the reinterpretation studies done on the relevant ATLAS searches to assess how the excesses can be related to new Higgses at the electroweak scale.
Speaker: Andreas Crivellin (University of Zurich (CH)) -
45
Flavour anomalies and implications for BSM
Speaker: Nazila Mahmoudi (CERN and Lyon University (FR))
-
46
Discussion - from RiF to REI WG 4/3-006 - TH Conference Room
-
47
Final remarks and future plans 4/3-006 - TH Conference Room
Speakers: Martin Habedank (University of Glasgow), Sabine Kraml (LPSC Grenoble), Sezen Sekmen (Kyungpook National University (KR))
-
12:30
Lunch
-