Speaker
Ken Bloom
(University of Nebraska-Lincoln)
Description
Each LHC experiment will produce datasets with sizes of order one petabyte
per year. All of these data must be stored, processed, transferred, simulated,
and analyzed, which requires a computing system on a larger scale than has ever
been mounted for any particle physics experiment, and possibly for any enterprise
in the world. I will discuss how CMS has chosen to address these challenges,
focusing on recent tests of the system that demonstrate the experiment's
readiness for producing physics results with the first LHC data.