Conveners
Plenary Session Wednesday
- Nikolai Hartmann (Ludwig Maximilians Universität (DE))
- Matthew Feickert (University of Wisconsin Madison (US))
Plenary Session Wednesday
- Jim Pivarski (Princeton University)
- Eduardo Rodrigues (University of Liverpool (GB))
As pyhf continues to be developed and its user community has grown significantly, both in size and in the subfields of physics represented, the needs of the user base have begun to expand beyond simple inference tasks. In this tutorial we will cover some recent features added to pyhf
as well as give a short tour of possible example use cases across high energy...
The statistical models used in modern HEP research are independent of their specific implementations. As a consequence, many different tools have been developed to perform statistical analyses in HEP. These implementations differ in both performance and usability. In this landscape, comparative benchmarks are essential to aid users in choosing a library and to...
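A comparative benchmark of this kind can be as simple as timing the same fit in each library. A minimal sketch using only numpy and the standard library, with a hand-written Poisson likelihood standing in for the real implementations:

```python
import timeit
import numpy as np

rng = np.random.default_rng(0)
observed = rng.poisson(10.0, size=1000).astype(float)

def poisson_nll(mu):
    """Negative log-likelihood of a common Poisson mean (constants dropped)."""
    expected = np.full_like(observed, mu)
    return np.sum(expected - observed * np.log(expected))

def fit():
    """One 'fit' whose cost we want to compare across implementations:
    here, a simple grid scan for the minimum of the NLL."""
    grid = np.linspace(5.0, 15.0, 201)
    return grid[np.argmin([poisson_nll(m) for m in grid])]

best = fit()
seconds = timeit.timeit(fit, number=10) / 10
print(f"best-fit mu = {best:.2f}, {seconds * 1e3:.1f} ms per fit")
```

A real benchmark would swap the toy `fit` for each library's fitting call and compare wall-clock times on identical models.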
When enumerating the environments in which HEP researchers perform their analyses, the browser may not be the first that comes to mind. Yet recent innovations in the tooling behind conda-forge, Emscripten, and WebAssembly have made it easier than ever to deploy complex, multi-dependency environments to the web.
An introduction will be given to the technologies that make this possible,...
Collider physics analyses have historically favored frequentist statistical methodologies, with some exceptions of Bayesian inference in LHC analyses through use of the Bayesian Analysis Toolkit (BAT). To enable advanced Bayesian methodologies for binned statistical models based on the HistFactory framework, which is widely used in high-energy physics, we developed the Python...
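For intuition, the Bayesian treatment of even the simplest binned model, a single Poisson count with a flat prior on the rate (a toy stand-in for a HistFactory channel, not the package's API), can be sampled with a few lines of Metropolis-Hastings:

```python
import numpy as np

rng = np.random.default_rng(1)
observed = 12  # hypothetical single-bin count

def log_posterior(rate):
    """Poisson log-likelihood with a flat prior on rate > 0 (constants dropped)."""
    if rate <= 0:
        return -np.inf
    return observed * np.log(rate) - rate

# Random-walk Metropolis-Hastings over the single rate parameter.
samples = []
rate = 10.0
for _ in range(20000):
    proposal = rate + rng.normal(0.0, 1.0)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(rate):
        rate = proposal
    samples.append(rate)

posterior = np.array(samples[5000:])  # discard burn-in
print(f"posterior mean rate = {posterior.mean():.1f}")
```

For this flat prior the posterior is a Gamma distribution with mean `observed + 1`; real HistFactory models add nuisance parameters and constraint terms, which is what dedicated tooling handles.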
Data formats for scientific data often differ across experiments due to hardware design and availability constraints. To interact with these formats, researchers have to develop, document, and maintain experiment-specific analysis software that is often tightly coupled to a particular data format. This proliferation of custom data formats has been a prominent challenge for the Nuclear and High...
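One common way to contain this coupling is a thin adapter layer that maps each custom format onto a shared in-memory representation, so the analysis code is written once. A hypothetical sketch (format names and field names invented for illustration):

```python
import json
import struct

def read_json_events(path):
    """Adapter for a hypothetical JSON event format."""
    with open(path) as f:
        return [float(e["energy"]) for e in json.load(f)["events"]]

def read_binary_events(path):
    """Adapter for a hypothetical packed-binary format: little-endian doubles."""
    with open(path, "rb") as f:
        raw = f.read()
    return list(struct.unpack(f"<{len(raw) // 8}d", raw))

READERS = {"json": read_json_events, "bin": read_binary_events}

def mean_energy(path, fmt):
    """Analysis code written once against the shared list-of-floats view."""
    energies = READERS[fmt](path)
    return sum(energies) / len(energies)
```

Supporting a new experiment's format then means adding one adapter, not rewriting the analysis.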
The fast-approaching High Luminosity LHC phase introduces significant challenges and opportunities for CMS. One of its major detector upgrades, the High Granularity Calorimeter (HGCAL), brings fine segmentation to the endcap regions. It requires a fast online trigger system (12.5 µs latency) to extract interesting information from the ~100 Tb of data produced every second by its custom read-out...