Statistical inference is a crucial part of HEP analyses. Historically based on RooFit and RooStats, the statistical tools used by the experiments now face unprecedented challenges: the rapidly growing complexity of statistical models, which can involve hundreds of parameters of interest and thousands of nuisance parameters; the need for scalable performance in large likelihood minimizations; and the demand for interoperability across an increasingly diverse ecosystem of tools, computational hardware, and frameworks and libraries, especially those developed and used within the machine learning world.
This talk summarizes the status and future plans of the statistical tools used by some of the main LHC experiments (CMS, ATLAS), with a focus on improvements coming from the ROOT world (RooFit automatic differentiation), interoperability with modern libraries (JAX), and communication across frameworks (HS3).
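To make the interoperability point concrete, the sketch below shows (under assumptions of mine, not material from the talk) how a toy binned Poisson likelihood of the kind these tools minimize can be written in JAX, so that exact gradients for the minimization come from automatic differentiation rather than finite differences. All names and numbers are illustrative; this is not the API of any experiment's framework.

```python
import jax
import jax.numpy as jnp

# Toy inputs: observed bin counts and nominal signal/background templates.
observed = jnp.array([12.0, 25.0, 9.0])
signal = jnp.array([3.0, 8.0, 2.0])
background = jnp.array([10.0, 15.0, 8.0])

def nll(params):
    """Negative log-likelihood with a signal strength mu and one
    nuisance parameter theta scaling the background by +/-10% per
    unit, plus a unit-Gaussian constraint term on theta."""
    mu, theta = params
    expected = mu * signal + (1.0 + 0.1 * theta) * background
    # Poisson terms up to a params-independent constant (log observed!).
    poisson_terms = expected - observed * jnp.log(expected)
    constraint = 0.5 * theta**2
    return jnp.sum(poisson_terms) + constraint

# Exact gradient of the likelihood via automatic differentiation.
grad_nll = jax.grad(nll)

# Plain gradient descent, just to show the differentiable function in use;
# a real fit would hand nll and grad_nll to a proper minimizer.
params = jnp.array([1.0, 0.0])
for _ in range(500):
    params = params - 0.01 * grad_nll(params)
```

The same pattern scales to thousands of nuisance parameters, since `jax.grad` returns the full gradient at roughly the cost of one likelihood evaluation, which is the motivation for bringing automatic differentiation into RooFit-style fits.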