Marco Adinolfi (University of Bristol)
After several years of experience with Grid production and analysis of simulated data, the first LHC collision data (as of March 2010) have confronted the LHCb Computing Model with real data. The LHCb Computing Model differs somewhat from the traditional MONARC hierarchical model used by the other LHC experiments: first-pass reconstruction, as well as further reprocessings, is performed at a set of 7 Tier-1 sites (including CERN), while Tier-2 sites are used mainly for simulation production. User analysis is performed at LHCb Analysis Centres, for which the baseline is the 7 Tier-1s. Event reconstruction is enabled only after thorough checking of the quality of the data. If a new calibration or alignment of the detector is needed, new calibration constants are generated and certified before the reconstruction can proceed. Analysis relies on the concept of reduced datasets (so-called stripped datasets) that are centrally produced at the 7 Tier-1s and then distributed to all the analysis centres. We shall review the performance of this model with the 2010 real data, and give an outlook on possible modifications to be put in place for the 2011 run.
Speaker’s Bureau LHCB (Institute for Theoretical and Experimental Physics (ITEP))