Description
The offline software and computing systems of the LHC experiments continue to evolve to meet the challenge of delivering data effectively to LHC analysts. Looking ahead to Run 3 and the high-luminosity LHC (HL-LHC), the data rates required by the HL-LHC physics program will far outstrip what current analysis and production computing approaches can provide. In this presentation, we will discuss how the entire offline system of the CMS experiment is evolving in anticipation of increased data volumes, higher pileup interaction rates, and the evolution of computing technology. We will cover short-term developments and longer-term ideas, including changes to distributed analysis, data structures, and data reduction approaches; the adoption of modern software development practices; the use of hardware accelerators; and novel approaches to simulation, reconstruction, and analysis algorithms.