Speaker
Dr Oliver Gutsche (FERMILAB)
Description
The CMS computing model for processing and analyzing LHC collision data
follows a data-location-driven approach and uses the WLCG
infrastructure to provide access to GRID resources. In preparation for
data taking, which begins at the end of 2007, CMS tests its computing
model during dedicated data challenges.
Within this model, user analysis plays an important role and poses a
special challenge for the infrastructure because of its random,
distributed access patterns. For this purpose, CMS developed the CMS
Remote Analysis Builder (CRAB), which handles all interactions with the
WLCG infrastructure transparently for the user.
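As an illustration, a minimal CRAB job configuration might look like
the sketch below. The section and parameter names follow common CRAB
crab.cfg conventions; the dataset path, parameter-set file, and job
splitting values are hypothetical.

    [CRAB]
    jobtype   = cmssw
    scheduler = glite          # EGEE resource broker; condor_g on OSG

    [CMSSW]
    datasetpath            = /PrimaryDataset/ProcessedDataset/RECO  # hypothetical
    pset                   = analysis.cfg                           # hypothetical
    total_number_of_events = -1
    events_per_job         = 1000
    output_file            = analysis.root

    [USER]
    return_data = 1
    copy_data   = 0

The user only edits such a configuration; CRAB takes care of job
creation, submission, monitoring, and output retrieval on the GRID.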
During the 2006 challenge, CMS set the goal of testing the
infrastructure at a scale of 50,000 user jobs per day submitted through
CRAB. Both direct submissions by individual users and automated
submissions by robots were used to achieve this goal. A report will be
given on the outcome of the user analysis part of the challenge, and
observations made during these tests on both the EGEE and OSG parts of
the WLCG will be presented. The test infrastructure will be described,
and improvements made during the challenge to reach the target scale
will be discussed.
In particular, the most prominent difference in the submission
structure between the two GRID middlewares will be discussed with
regard to its impact on the challenge: EGEE uses a resource-broker
submission approach, while OSG uses direct Condor-G submissions.
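To make the contrast concrete, the sketch below shows what a direct
Condor-G submission to an OSG site might look like; the keywords are
standard Condor grid-universe submit-file syntax, while the gatekeeper
hostname and file names are hypothetical.

    # Condor-G submit description file (illustrative sketch)
    universe      = grid
    grid_resource = gt2 osg-ce.example.edu/jobmanager-condor
    executable    = crab_job.sh
    output        = job.out
    error         = job.err
    log           = job.log
    queue

On EGEE, by contrast, the job is handed to a resource broker that
matches it to a site, so the submission itself names no gatekeeper.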
For 2007, CMS plans to double the scale of these tests. A report will
be given on the work done in 2007 in preparation for the summer 2007
data challenge, and first results will be presented.
Submitted on behalf of the CMS Computing group
Primary author
Dr Oliver Gutsche (FERMILAB)