Description
The High-Luminosity LHC will open an unprecedented window on the weak-scale nature of the universe, providing high-precision measurements of the standard model as well as searches for new physics beyond it. Such precision measurements and searches require information-rich datasets with a statistical power that matches the high luminosity provided by the Phase-2 upgrade of the LHC. Efficiently collecting those datasets will be a challenging task, given the harsh environment of 200 proton-proton interactions per LHC bunch crossing. For this purpose, CMS is designing an efficient data-processing hardware trigger (Level-1) that will include tracking information and high-granularity calorimeter information. The current conceptual system design is expected to take full advantage of FPGA and link technologies over the coming years, providing a high-performance, low-latency computing platform for large throughput and sophisticated data correlation across diverse sources. The envisaged Level-1 system will more closely replicate the full offline object reconstruction, enabling a more sophisticated and optimized selection. Algorithms such as particle-flow reconstruction can be implemented and complemented by standalone trigger-object reconstruction. The expected performance and physics implications of such algorithms are studied using Monte Carlo samples with high pile-up, simulating the harsh conditions of the HL-LHC. The trigger-object requirements are driven not only by the need to maintain physics selection thresholds matching those of Phase-1; the selection of exotic signatures, including displaced objects, must also be provided to help expand the physics reach of the experiment. This presentation will summarize the expected acceptance increase on selected benchmark signals obtained with the upgraded CMS Phase-2 Level-1 trigger.
Secondary track (number): 12.