Speaker
Dirk Hufnagel
(for the CMS Offline/Computing group)
Description
With the upcoming LHC engineering run in November, the CMS Tier0 computing
effort will be one of the most important activities of the experiment.
The CMS Tier0 is responsible for all data handling and processing of real
data events in the first period of their life, from when the data is
written by the DAQ system to a disk buffer at the CMS experiment site to
when it is transferred from CERN to the Tier1 computer centers.
The CMS Tier0 accomplishes three principal processing tasks: the realization
of the data streaming model of CMS, the automated production of calibration
and alignment constants, and the first full reconstruction of the raw data.
The presentation will describe the data streaming model of CMS and how
this is implemented in the CMS trigger/DAQ and the Tier0. For the Tier0,
this implementation means reorganizing the data from a format determined
by the demands of the CMS trigger/DAQ into one determined by physics
demands. We will also describe the design and implementation of the
Prompt Calibration and Prompt Reconstruction workflows. The data flow
underlying these workflows will be shown and first results from data
challenges and scale tests will be presented.
Submitted on behalf of collaboration: CMS
Author
Dirk Hufnagel
(for the CMS Offline/Computing group)