Speaker
Federica Fanzago
(INFN-PD)
Description
In September 2007 the LHC accelerator will start its activity.
CMS, one of the four LHC experiments, will produce a large amount of data that must be stored and analyzed.
The CMS computing model is based on the grid paradigm: data are spread over, and accessed at, a number of geographically distributed computing centres.
Until real data become available, the CMS community needs simulated data to study the detector response and the foreseen physics interactions, and to gain experience with data management and analysis. A large amount of simulated data has therefore been produced and distributed among the computing centres. Real data will be analyzed by physicists at an expected rate of ~100000 jobs per day using the grid infrastructure.
To reach this analysis goal, CMS is developing CRAB (CMS Remote Analysis Builder), a user-friendly tool that allows generic users without knowledge of the grid infrastructure to access data and perform their analysis as simply as in a local environment.
CRAB is deployed by CMS to access remote data and takes care of the interaction with all Data Management services, from data discovery and location to output file management.
An overview of the current implementation of this tool, its
interaction with grid
middleware and its usage is presented in this work.
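As a rough illustration of the workflow that CRAB automates on behalf of the user (data discovery and location, job splitting, submission through the grid middleware), the sketch below outlines these steps in plain Python. All names and values here are hypothetical placeholders chosen for this example; they do not correspond to the actual CRAB code or to real CMS service interfaces.

# Hypothetical sketch of the analysis workflow CRAB automates; names and
# structure are illustrative only and do not reflect the real CRAB code base.
from dataclasses import dataclass

@dataclass
class Job:
    dataset: str
    site: str
    first_event: int
    n_events: int

def discover_dataset(dataset_path):
    # In CRAB this step queries the CMS data discovery and location
    # services; here we simply return a fixed list of hosting sites.
    return ["T2_IT_Legnaro", "T2_IT_Bari"]

def split_into_jobs(dataset_path, sites, total_events, events_per_job):
    # Split the requested event range into independent grid jobs.
    jobs = []
    for first in range(0, total_events, events_per_job):
        n = min(events_per_job, total_events - first)
        jobs.append(Job(dataset_path, sites[0], first, n))
    return jobs

def submit_to_grid(jobs):
    # Placeholder for submission through the grid middleware; CRAB wraps
    # the user's analysis configuration and ships it with each job.
    for j in jobs:
        print(f"submitting job on {j.site}: events "
              f"{j.first_event}-{j.first_event + j.n_events - 1}")

if __name__ == "__main__":
    dataset = "/simulated/dataset/example"   # hypothetical dataset path
    sites = discover_dataset(dataset)
    jobs = split_into_jobs(dataset, sites, total_events=1000, events_per_job=250)
    submit_to_grid(jobs)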
Author
Federica Fanzago
(INFN-PD)
Co-authors
Marco Corvo
(CNAF-CERN)
Alessandra Fanfani
(UNIBO)
Daniele Spiga
(INFN-PG)
Fabio Farina
(INFN-MI)
Nicola De Filippis
(INFN-BA)
Stefano Lacaprara
(INFN-LNL)