Description
Calibrating the detector in near real time is key to exploiting the large data volumes at the LHC experiments. For this purpose the CMS collaboration has deployed a complex machinery involving several components of the processing infrastructure and of the condition DB system. Accurate reconstruction of the data can start only once all the calibrations become available for consumption, and it relies on continuous and detailed monitoring of the calibration machinery and of the physics performance of its products. This monitoring task requires aggregating, digesting, and reacting to information from all the different components, which are based on very heterogeneous technologies: the Tier0 processing farm, the Oracle-based condition DB, the data quality monitoring framework, and various other logging and bookkeeping services. A dedicated application has been designed and deployed to spy data from these various sources, present them on a web-based interface, and communicate to the computing infrastructure the readiness for reconstruction of any chunk of data. The presentation reports on the design choices and operational experience of this new tool.
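As a rough illustration of the aggregation pattern described above, the sketch below polls a set of heterogeneous sources through a uniform interface and declares a data chunk ready for reconstruction only when every component reports success. All names (SourceStatus, poll_sources, the probe labels) are hypothetical assumptions for illustration, not the actual CMS implementation.

```python
"""Minimal sketch, assuming a pull-based aggregator; not the CMS code."""
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class SourceStatus:
    """Normalized view of one monitored component (hypothetical)."""
    name: str
    ready: bool
    detail: str


def poll_sources(sources: Dict[str, Callable[[], SourceStatus]]) -> List[SourceStatus]:
    """Query each heterogeneous back end through a uniform callable probe."""
    return [probe() for probe in sources.values()]


def calibration_ready(statuses: List[SourceStatus]) -> bool:
    """Readiness for reconstruction requires every component to be ready."""
    return all(s.ready for s in statuses)


# Hypothetical probes standing in for the real back ends
# (Tier0 farm, Oracle condition DB, DQM, bookkeeping services).
sources = {
    "tier0": lambda: SourceStatus("tier0", True, "calibration workflow finished"),
    "conddb": lambda: SourceStatus("conddb", True, "payloads uploaded"),
    "dqm": lambda: SourceStatus("dqm", False, "validation histograms pending"),
}

if __name__ == "__main__":
    statuses = poll_sources(sources)
    for s in statuses:
        print(f"{s.name:8s} ready={s.ready} ({s.detail})")
    print("ready for reconstruction:", calibration_ready(statuses))
```

In this sketch the per-source probes hide the technology differences (database queries, HTTP calls, log parsing) behind a single return type, so the aggregation and readiness logic stays independent of any one back end; the same normalized records could feed both a web display and a readiness signal to the computing infrastructure.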