Description
The second phase of the LHC will collect an unprecedented amount of proton-proton data at the highest centre-of-mass energies ever achieved. The machine is expected to deliver an average of 140 simultaneous collisions per bunch crossing at a luminosity of around 5x10³⁴ cm⁻² s⁻¹. This poses a challenge to the detectors, which will have to cope with a harsh radiation environment and a high flux of particles. It is also a challenge for the computing system, which will have to reconstruct high-multiplicity events, identify the hard collision of interest, and provide the performance required to improve the measurement of the Higgs boson properties to percent-level precision.
Part of the upgrades of the CMS experiment focuses on the use of high-granularity detectors, including precision timing in the calorimeters and dedicated timing layers placed in front of both the barrel and endcap calorimeters. As part of this strategy, the upgraded endcap calorimeter (HGCAL) will provide measurements of energy and time for particles using approximately 6 million channels. The next years will be dedicated to the construction and commissioning of HGCAL.
In this project, the calibration algorithms for the energy and time measurements in HGCAL will be explored using data from beam tests, cosmic muons, laser pulses, or charge injection, in order to establish a fast local reconstruction algorithm. The algorithms will be implemented using the heterogeneous computing paradigm, such that they can run on both CPUs and GPUs and meet the foreseen budget of O(50 ms) to decode the binary data and perform the local reconstruction for the high-level trigger.
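To illustrate the heterogeneous-computing idea described above, one common pattern is to write the local reconstruction kernels against a generic array module, so that the identical code runs on the CPU (NumPy) or on a GPU (e.g. CuPy, which mirrors the NumPy API). This is only a minimal sketch; the function, the toy pedestal/gain calibration, and all constants are hypothetical and not the actual HGCAL reconstruction:

```python
import numpy as np

def calibrate(raw_adc, pedestal, gain, xp=np):
    """Toy local-reconstruction step: pedestal subtraction and gain
    correction, vectorized over all channels at once.

    Passing xp=cupy (if installed) would execute the same code on a
    GPU, which is the essence of writing one kernel that serves both
    CPU and GPU back ends within a fixed time budget."""
    return (xp.asarray(raw_adc) - pedestal) * gain

# CPU example: three channels of raw ADC counts (illustrative values)
energies = calibrate(np.array([120.0, 95.0, 300.0]),
                     pedestal=90.0, gain=0.5)
```

Because the arithmetic is expressed as whole-array operations rather than per-channel loops, the same source scales from a laptop test to a many-core or GPU deployment without modification.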