Description
For Run III (2021 onwards) of the LHC, LHCb will take data at an instantaneous luminosity of 2 × 10^{33} cm^{-2} s^{-1}, five times higher than in Run II (2015-2018). To cope with the harsher data-taking conditions, the LHCb collaboration will upgrade the DAQ system and install a purely software-based trigger, in addition to various detector upgrades. The high readout rate contributes to the challenge of reconstructing and selecting events in real time.
Special emphasis in this talk will be put on the need for fast track reconstruction in the software trigger. I demonstrate how the modified detector infrastructure will be able to meet this challenge and discuss the necessary changes to the reconstruction sequence. I present a novel strategy, based on a genetic algorithm, to distribute the bandwidth among the different physics channels and make maximal use of it.
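As an illustration of how such an optimisation could be set up, the following is a minimal sketch of a genetic algorithm that splits a fixed trigger output bandwidth among a handful of physics channels. The channel weights, the fitness function and all names are hypothetical and do not represent the actual LHCb implementation.

// Toy genetic algorithm: allocate a bandwidth budget of 1.0 among N channels.
// Weights, fitness and parameters are illustrative only.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

using Individual = std::vector<double>;  // bandwidth fraction per channel

std::mt19937 rng{42};

// Toy fitness: diminishing returns (sqrt) per channel, weighted by an assumed
// physics priority; a large penalty is applied if the total budget is exceeded.
double fitness(const Individual& ind, const std::vector<double>& weight) {
  double total = std::accumulate(ind.begin(), ind.end(), 0.0);
  double value = 0.0;
  for (std::size_t i = 0; i < ind.size(); ++i)
    value += weight[i] * std::sqrt(ind[i]);
  return value - 100.0 * std::max(0.0, total - 1.0);
}

Individual randomIndividual(std::size_t n) {
  std::uniform_real_distribution<double> u(0.0, 1.0 / n);
  Individual ind(n);
  for (auto& x : ind) x = u(rng);
  return ind;
}

int main() {
  const std::size_t nChannels = 5, popSize = 50, nGen = 200;
  const std::vector<double> weight{1.0, 0.8, 0.6, 0.4, 0.2};  // assumed priorities

  std::vector<Individual> pop(popSize);
  for (auto& ind : pop) ind = randomIndividual(nChannels);

  std::uniform_int_distribution<std::size_t> pick(0, popSize - 1);
  std::normal_distribution<double> mut(0.0, 0.01);

  for (std::size_t gen = 0; gen < nGen; ++gen) {
    std::vector<Individual> next;
    for (std::size_t k = 0; k < popSize; ++k) {
      // Tournament selection of two parents.
      auto best = [&](std::size_t a, std::size_t b) {
        return fitness(pop[a], weight) > fitness(pop[b], weight) ? pop[a] : pop[b];
      };
      Individual p1 = best(pick(rng), pick(rng));
      Individual p2 = best(pick(rng), pick(rng));
      // Uniform crossover plus Gaussian mutation, clamped to non-negative values.
      Individual child(nChannels);
      for (std::size_t i = 0; i < nChannels; ++i) {
        child[i] = (pick(rng) % 2 ? p1[i] : p2[i]) + mut(rng);
        child[i] = std::max(0.0, child[i]);
      }
      next.push_back(std::move(child));
    }
    pop = std::move(next);
  }

  auto bestIt = std::max_element(pop.begin(), pop.end(),
      [&](const Individual& a, const Individual& b) {
        return fitness(a, weight) < fitness(b, weight);
      });
  for (std::size_t i = 0; i < nChannels; ++i)
    std::cout << "channel " << i << ": " << (*bestIt)[i] << '\n';
}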
The data processing chain includes a re-design of the event scheduling, the introduction of concurrent processing, optimisations of processor cache accesses and code vectorisation. Furthermore, changes in the areas of the event model, conditions data and detector description are foreseen.
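To illustrate the kind of memory-layout change that helps compilers vectorise code and improves cache behaviour, the following is a minimal sketch contrasting an array-of-structures hit container with a structure-of-arrays one. The types, fields and the example kernel are hypothetical and not the actual LHCb event model.

// Structure-of-arrays layout as a vectorisation-friendly alternative to
// array-of-structures; all names are illustrative only.
#include <cstddef>
#include <iostream>
#include <vector>

// Array-of-structures: each hit's fields are interleaved in memory, which
// hinders contiguous SIMD loads when only x and y are needed.
struct HitAoS { float x, y, z, charge; };

// Structure-of-arrays: identical fields are contiguous, so a loop over
// x and y can be auto-vectorised and touches fewer cache lines.
struct HitsSoA {
  std::vector<float> x, y, z, charge;
  std::size_t size() const { return x.size(); }
};

// Example kernel: squared transverse distance for every hit.
void radius2(const HitsSoA& hits, std::vector<float>& out) {
  out.resize(hits.size());
  for (std::size_t i = 0; i < hits.size(); ++i)
    out[i] = hits.x[i] * hits.x[i] + hits.y[i] * hits.y[i];
}

int main() {
  HitsSoA hits;
  hits.x = {1.f, 2.f, 3.f};
  hits.y = {4.f, 5.f, 6.f};
  hits.z = {0.f, 0.f, 0.f};
  hits.charge = {1.f, 1.f, 1.f};
  std::vector<float> r2;
  radius2(hits, r2);
  for (float v : r2) std::cout << v << '\n';
}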