Speaker
Description
A decade of data-taking at the LHC has seen great progress in ruling out archetypal new-physics models up to high direct-production energy scales, but few persistent deviations from the SM have been observed. As we head into the new data-taking era, it is therefore of paramount importance to look beyond such archetypes and consider general BSM models that exhibit multiple phenomenological signatures. Typically, however, each such signature appears at lower strength than in the archetypal simplified models: constraining these models significantly requires a move away from single, "silver-bullet" analyses towards a holistic approach in which many analyses are combined into composite likelihoods. Such combinations require understanding analysis overlaps and identifying the optimal analysis combination for each point in model space. In this contribution, we present the TACO method, which uses computational statistics in combination with LHC data-reinterpretation tools to estimate analysis correlations and hence find their optimal combinations. Across several BSM-model scenarios, we show that the TACO approach can significantly increase both exclusion and observation power.
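As an illustration of the kind of composite likelihood such a combination targets, the sketch below simply multiplies Poisson likelihoods across approximately uncorrelated signal regions; the counts, helper names, and test statistic are illustrative assumptions, not values or code from the TACO study.

```python
# A minimal sketch of a composite likelihood over approximately
# uncorrelated signal regions; the counts and helper names below are
# illustrative assumptions, not values from the TACO study.
from scipy.stats import poisson

def combined_log_likelihood(mu, signal_regions):
    """Sum of Poisson log-likelihoods over signal regions, assuming the
    regions share (almost) no events and therefore factorise."""
    logL = 0.0
    for sr in signal_regions:
        expected = sr["background"] + mu * sr["signal"]
        logL += poisson.logpmf(sr["observed"], expected)
    return logL

# Hypothetical inputs: one signal region from each of two analyses.
signal_regions = [
    {"observed": 12, "background": 10.0, "signal": 3.0},
    {"observed": 25, "background": 22.0, "signal": 5.5},
]

# Simple log-likelihood-ratio test statistic comparing the signal
# hypothesis (mu = 1) with the background-only hypothesis (mu = 0).
q = -2.0 * (combined_log_likelihood(1.0, signal_regions)
            - combined_log_likelihood(0.0, signal_regions))
print(f"combined test statistic q = {q:.2f}")
```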
Significance
This contribution presents a new method, corresponding to a paper in the late stages of preparation, based on a novel combination of a bootstrapping method for estimating analysis event-sharing correlations with graph-theory approaches for efficiently and scalably identifying the analysis combinations with maximum BSM sensitivity from the combinatorial space of all signal regions.
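As a rough illustration of how these two ingredients fit together, the sketch below bootstraps a hypothetical event-acceptance matrix to estimate pairwise yield correlations between signal regions, then searches a graph of approximately uncorrelated regions for the combination with the largest summed sensitivity. All inputs, weights, and the correlation threshold here are assumptions made for illustration, not the TACO implementation.

```python
# A minimal sketch, not the TACO code: Poisson-bootstrap the per-event
# acceptance flags to estimate pairwise yield correlations between signal
# regions, build a "compatibility" graph whose edges connect regions with
# small estimated correlation, and pick the clique of mutually compatible
# regions with the largest total sensitivity weight.
import numpy as np
import networkx as nx

rng = np.random.default_rng(42)

# Hypothetical acceptance matrix: rows are simulated events, columns are
# signal regions; entries flag whether the event passes that region.
n_events, n_regions = 5000, 6
acceptance = rng.random((n_events, n_regions)) < 0.05

# Poisson bootstrap: reweight each event with a Poisson(1) weight and
# recompute the signal-region yields for each replica.
n_boot = 500
weights = rng.poisson(1.0, size=(n_boot, n_events))
yields = weights @ acceptance             # shape: (n_boot, n_regions)
corr = np.corrcoef(yields, rowvar=False)  # pairwise yield correlations

# Hypothetical per-region sensitivity weights (e.g. expected significances).
sensitivity = rng.uniform(0.5, 2.0, size=n_regions)

# Regions count as "compatible" if their estimated correlation is small;
# the threshold is an illustrative choice.
threshold = 0.1
G = nx.Graph()
G.add_nodes_from(range(n_regions))
for i in range(n_regions):
    for j in range(i + 1, n_regions):
        if abs(corr[i, j]) < threshold:
            G.add_edge(i, j)

# Choose the clique of mutually compatible regions with the largest
# total sensitivity weight.
best = max(nx.find_cliques(G), key=lambda c: sensitivity[list(c)].sum())
print("selected signal regions:", sorted(best))
```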
References
Prototype correlation analysis in Les Houches 2019