
Talk
Title CMS experience with online and offline Databases
Video MP4 High (600 kbps); Windows Media Medium (480 kbps); Flash High (753 kbps); High-resolution.
Author(s) Pfeiffer, Andreas (speaker) (CERN)
Corporate author(s) CERN. Geneva
Imprint 2012-05-21. - Streaming video, 00:20:48.
Series Conferences; Computing in High Energy and Nuclear Physics (CHEP) 2012
Lecture note on 2012-05-21T16:35:00
Subject category Conferences
Abstract The CMS experiment is made of many detectors which in total sum up to more than 75 million channels. The online database stores the configuration data used to configure the various parts of the detector and bring it in all possible running states. The database also stores the conditions data, detector monitoring parameters of all channels (temperatures, voltages), detector quality information, beam conditions, etc. These quantities are used by the experts to monitor the detector performance in detail, as they occupy a very large space in the online database they cannot be used as-is for offline data reconstruction. For this, a "condensed" set of the full information, the "conditions data", is created and copied to a separate database used in the offline reconstruction. The offline conditions database contains the alignment and calibrations data for the various detectors. Conditions data sets are accessed by a tag and an interval of validity through the offline reconstruction program CMSSW, written in C++. Performant access to the conditions data as C++ objects is a key requirement for the reconstruction and data analysis. About 200 types of calibration and alignment exist for the various CMS sub-detectors. Only those data which are crucial for reconstruction are inserted into the offline conditions DB. This guarantees a fast access to conditions during reconstruction and a small size of the conditions DB. Calibration and alignment data are fundamental to maintain the design performance of the experiment. Very fast workflows have been put in place to compute and validate the alignment and calibration sets and insert them in the conditions database before the reconstruction process starts. Some of these sets are produced analyzing and summarizing the parameters stored in the online database. Others are computed using event data through a special express workflow. A dedicated monitoring system has been put up to monitor these time-critical processes. The talk describes the experience with the CMS online and offline databases during the 2010 and 2011 data taking periods, showing some of the issues found and lessons learned.
Copyright/License © 2012-2024 CERN
Submitted by jd@bnl.gov

 Record created 2012-07-09, last modified 2022-11-02


External links:
Talk details
Event details