22-28 July 2010
Palais des Congrès de Paris
Europe/Paris timezone

LHCb Computing experience with first data

24 Jul 2010, 14:20
Salle 252A

Parallel Session Talk: 13 - Advances in Instrumentation and Computing for HEP


Marco Adinolfi (University of Bristol)


After several years of experience with Grid production and analysis of simulated data, the first LHC collision data (as of March 2010) have confronted the LHCb Computing Model with real data. The LHCb Computing Model differs somewhat from the traditional MONARC hierarchical model used by the other LHC experiments: first-pass reconstruction, as well as further reprocessings, is performed at a set of seven Tier-1 sites (including CERN), while Tier-2 sites are used mainly for simulation production. User analysis is performed at LHCb Analysis Centres, which for the time being are the seven Tier-1 sites. Event reconstruction is enabled only after thorough checking of the quality of the data. If a new calibration or alignment of the detector is needed, new calibration constants are generated and certified before the reconstruction can proceed. Analysis relies on the concept of reduced datasets (so-called stripped datasets) that are centrally produced at the seven Tier-1 sites and then distributed to all the analysis centres. We review the performance of this model with the 2010 real data and give an outlook on possible modifications to be put in place for the 2011 run.

Primary author

Speaker’s Bureau LHCB (Institute for Theoretical and Experimental Physics (ITEP))

Presentation Materials