Speaker
Ian Fisk
(Fermi National Accelerator Laboratory (FNAL))
Description
CMS is in the process of commissioning a complex detector and a globally distributed computing model simultaneously. This represents a unique challenge for the current generation of experiments. Even at the beginning, there are not sufficient analysis or organized processing resources at CERN alone. In this presentation we will discuss the unique computing challenges CMS expects to face during the first year of running and how these influence the baseline computing model decisions. During the early accelerator commissioning periods, CMS will attempt to collect as many events as possible when the beam is on to provide adequate early commissioning data. Some of these plans involve overdriving the Tier-0 infrastructure during data collection, with recovery when the beam is off. In addition to the larger number of triggered events, there will be pressure in the first year to collect and analyze more complete data formats as the summarized formats mature. These larger event formats impact the required storage, bandwidth, and processing capacity across all the computing centers. While the understanding of the detector and the event selections is being improved, there will likely be a larger number of reconstruction passes and skims performed by both central operations and individual users. We will discuss how these additional stresses impact the allocation of resources and the changes from the baseline computing model. We will also present the results of commissioning tests performed to ensure the system can gracefully handle these additional requirements.
Author
Ian Fisk
(Fermi National Accelerator Laboratory (FNAL))