The recent measurement of the W boson mass by CMS employs many cutting-edge data analysis techniques that will become essential in the years to come as the volumes of data and Monte Carlo simulation grow through the HL-LHC era. The analysis exploits the CMS NanoAOD data format, together with an optimized analysis pipeline built on RDataFrame, Eigen, Boost Histograms, and the Python scientific ecosystem. The statistical analysis, a complex multi-dimensional maximum likelihood fit, uses TensorFlow for fast and accurate calculation of the likelihood and its derivatives, enabling robust minimization. In this seminar, we will cover all these aspects, with emphasis both on computational efficiency and on the validation and reproducibility of the full analysis chain.
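To give a flavor of the likelihood-fit machinery described above, the sketch below shows a toy binned Poisson maximum-likelihood fit for a single signal-strength parameter, with the analytic gradient supplied to the minimizer. This is an illustrative assumption-laden example using NumPy and SciPy, not the CMS code: the actual analysis uses TensorFlow to compute the likelihood and its derivatives by automatic differentiation over many bins and nuisance parameters.

```python
import numpy as np
from scipy.optimize import minimize

# Toy setup (hypothetical numbers, for illustration only):
# expected signal and background yields in five bins, and pseudo-data
# drawn with a true signal strength of mu = 1.
rng = np.random.default_rng(42)
signal = np.array([5.0, 20.0, 35.0, 20.0, 5.0])
background = np.array([50.0, 40.0, 30.0, 40.0, 50.0])
observed = rng.poisson(1.0 * signal + background)

def nll(mu):
    """Binned Poisson negative log-likelihood, dropping mu-independent terms."""
    expected = mu * signal + background
    return np.sum(expected - observed * np.log(expected))

def grad(mu):
    """Analytic derivative d(NLL)/d(mu), the role autodiff plays at scale."""
    expected = mu * signal + background
    return np.array([np.sum(signal * (1.0 - observed / expected))])

# Minimize with the exact gradient, as a gradient-based fit would.
result = minimize(nll, x0=[0.5], jac=grad, method="BFGS")
mu_hat = result.x[0]
print(f"fitted signal strength mu = {mu_hat:.3f}")
```

In the real measurement the parameter space is far larger, which is why fast, exact derivatives from automatic differentiation matter for robust minimization.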
SPEAKER'S BIO: David Walter completed his PhD at DESY in 2022, where his research on CMS data analysis focused on top quark physics and precision Z boson measurements for luminosity determination. As a Senior Research Fellow at CERN, he contributed to finalizing the CMS W boson mass measurement and to advancing future precision analyses by further developing the analysis chain, setting up statistical analysis methods, and improving the computational efficiency, maintainability, and reproducibility of the physics results.