25–29 Sept 2006
Valencia, Spain

Commissioning and calibration of the CMS micro-strip tracker

27 Sept 2006, 16:20
1h 40m
Valencia, Spain

IFIC – Instituto de Física Corpuscular, Edificio Institutos de Investigación, Apartado de Correos 22085, E-46071 València, SPAIN

Speaker

Robert Bainbridge (Imperial College London)

Description

The CMS micro-strip tracker data acquisition system is based on an analogue front-end ASIC, optical readout and an off-detector VME board that performs digitization, zero-suppression and data formatting before forwarding event fragments to the online event-building farm. Sophisticated “commissioning” procedures are required to configure, calibrate and synchronize the 10 million readout channels optimally. The procedures are defined by data acquisition loops that configure and control the readout and local trigger systems and perform event building and data analysis. We present an overview of the commissioning procedures and results from the CMS Cosmic Challenge and large-scale system tests at the Tracker Integration Facility.

Summary

The micro-strip tracker for the CMS experiment, comprising a sensitive area of over 200 m² and 10 million readout
channels, is unprecedented in terms of size and complexity. The readout system is based on a 128-channel
analogue front-end ASIC, optical readout and an off-detector VME board that uses FPGA technology to
perform digitization, zero suppression and data formatting before forwarding event fragments to the CMS
online computing farm.
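
As an illustration of the zero-suppression step performed on the off-detector board, the following sketch shows one plausible scheme: subtract a per-strip pedestal, estimate and remove the common mode, and keep only strips above a noise-based threshold. The median common-mode estimator and the single five-sigma threshold are assumptions made for the sketch, not the actual FED firmware algorithm.

```cpp
// Illustrative zero-suppression sketch; threshold and common-mode scheme
// are assumptions, not the actual FED firmware algorithm.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Strip {
    std::uint16_t index;  // strip number within the readout unit
    std::int16_t  signal; // pedestal- and common-mode-subtracted ADC value
};

// Median of the pedestal-subtracted samples as a common-mode estimate.
std::int16_t commonMode(std::vector<std::int16_t> samples) {
    const auto mid = samples.begin() + samples.size() / 2;
    std::nth_element(samples.begin(), mid, samples.end());
    return *mid;
}

// Keep only strips whose corrected signal exceeds nSigma times the strip
// noise; the surviving strips form a sparse event fragment.
std::vector<Strip> zeroSuppress(const std::vector<std::int16_t>& raw,
                                const std::vector<std::int16_t>& pedestals,
                                const std::vector<float>& noise,
                                float nSigma = 5.0f) {
    std::vector<std::int16_t> corrected(raw.size());
    for (std::size_t i = 0; i < raw.size(); ++i)
        corrected[i] = static_cast<std::int16_t>(raw[i] - pedestals[i]);

    const std::int16_t cm = commonMode(corrected);

    std::vector<Strip> fragment;
    for (std::size_t i = 0; i < corrected.size(); ++i) {
        const auto signal = static_cast<std::int16_t>(corrected[i] - cm);
        if (signal > nSigma * noise[i])
            fragment.push_back({static_cast<std::uint16_t>(i), signal});
    }
    return fragment;
}
```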

Commissioning such a large-scale readout system requires sophisticated procedures to bring the detector
into an operational state that is suitable for physics data-taking. These procedures comprise several
independent tasks that fall into one of the following categories: automated detection of the readout system
partitioning and cabling; optimization of hardware configurations; synchronization of the front-end system,
both internally and to LHC collisions; and determination of calibration constants that are used by the hardware
and, in some cases, the CMS reconstruction software. These procedures will be used to validate the
operational functionality and performance of the detector during the start-up phase of the experiment and
will also be performed between fills to guarantee optimum detector performance during the subsequent
data-taking period.
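
The four categories could be captured in software behind a common task interface, as in the purely illustrative grouping below; the names and the interface are assumptions for the sketch, not the actual class hierarchy of the tracker software.

```cpp
// Illustrative grouping of the commissioning task categories listed above.
enum class TaskCategory {
    CablingDetection, // automated detection of partitioning and cabling
    HardwareTuning,   // optimization of hardware configurations
    Synchronization,  // internal timing and alignment to LHC collisions
    Calibration       // constants for the hardware and reconstruction
};

class CommissioningTask {
public:
    virtual ~CommissioningTask() = default;
    virtual TaskCategory category() const = 0;
    virtual void fill() = 0;    // accumulate statistics from one event
    virtual void analyze() = 0; // extract constants once the loop completes
};
```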

The software implementation for the commissioning procedures is divided between the CMS online and offline
software frameworks, known as XDAQ and CMSSW, respectively. XDAQ provides a core set of services and
tools, including: a fast communication protocol for peer-to-peer messaging between processes registered
with the framework; a finite-state machine schema and a slower communication protocol for configuration of
the framework processes; and standard event builder and memory management tools. CMSSW is the offline
software project comprising physics simulation, reconstruction, analysis and High-Level Trigger software, and
provides services such as a conditions database and a Data Quality Monitoring framework.
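
To make the finite-state machine schema concrete, the sketch below shows a minimal command-driven supervisor. The states, commands and class names are illustrative assumptions in the spirit of the schema described above, not the actual XDAQ API.

```cpp
#include <map>
#include <stdexcept>
#include <string>
#include <utility>

enum class State { Halted, Configured, Enabled };

class HardwareSupervisor {
    State state_ = State::Halted;
    // Allowed transitions: command -> (required current state, target state).
    const std::map<std::string, std::pair<State, State>> transitions_ = {
        {"Configure", {State::Halted,     State::Configured}},
        {"Enable",    {State::Configured, State::Enabled}},
        {"Halt",      {State::Enabled,    State::Halted}},
    };

public:
    // In the real system a command would arrive over the slower
    // (configuration) protocol; here it is just a method call.
    void fireEvent(const std::string& command) {
        const auto it = transitions_.find(command);
        if (it == transitions_.end() || it->second.first != state_)
            throw std::runtime_error("illegal transition: " + command);
        state_ = it->second.second; // hardware (re)configuration happens here
    }

    State state() const { return state_; }
};
```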

The commissioning procedures are defined by data acquisition loops that configure and control the readout
and local trigger systems and perform event building and data analysis. Communication between the various
distributed “hardware supervisor” processes is achieved using the XDAQ framework, which makes it possible to
automate the data acquisition loops, removing the need for repetitive run-control sequences and complex
bookkeeping and thereby accelerating detector commissioning and start-up. Data analysis is performed
within CMSSW, which determines optimized hardware configuration parameters and calibration constants from
reconstructed calibration pulses, timing-delay curves, dynamic-range curves and other features of the front-
end ASIC data stream. These optimized configurations and calibrations are then stored in a “hardware
configuration and calibration” database and provide the basis for subsequent commissioning tasks or physics
runs.
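
As a concrete example of such a loop, the sketch below scans a front-end timing delay, accumulates the mean pulse height at each setting and returns the optimum. The callbacks stand in for the hardware supervisors and the event builder; all names here are hypothetical, and a real analysis would fit the expected pulse shape rather than take a simple maximum.

```cpp
#include <functional>
#include <stdexcept>
#include <vector>

struct ScanPoint {
    int delay;              // delay setting under test
    double meanPulseHeight; // mean reconstructed pulse height at this setting
};

int scanTimingDelay(const std::vector<int>& delays,
                    int eventsPerPoint,
                    const std::function<void(int)>& configureDelay,
                    const std::function<double()>& readPulseHeight) {
    if (delays.empty())
        throw std::invalid_argument("empty delay scan");

    std::vector<ScanPoint> curve;
    for (int delay : delays) {
        configureDelay(delay);        // reconfigure hardware via supervisor
        double sum = 0.0;
        for (int i = 0; i < eventsPerPoint; ++i)
            sum += readPulseHeight(); // trigger, build and unpack one event
        curve.push_back({delay, sum / eventsPerPoint});
    }

    // A real analysis would fit the expected pulse shape; a simple maximum
    // of the delay curve is enough for this sketch.
    ScanPoint best = curve.front();
    for (const ScanPoint& p : curve)
        if (p.meanPulseHeight > best.meanPulseHeight) best = p;
    return best.delay;                // candidate for the configuration DB
}
```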

The software design ensures that both the local computing resources allocated to the tracker sub-detector
and the global resources provided by the online computing farm can be used transparently. The former option
will be the default configuration used during the start-up phase. The latter offers significant improvements in
detector readout speeds and CPU processing power, making it possible to reduce turn-around times between
physics runs.
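
This transparency can be pictured as dispatch over an abstract processing backend, as in the illustrative sketch below; the class and function names are assumptions, not the actual software design.

```cpp
#include <memory>
#include <string>

class ProcessingBackend {
public:
    virtual ~ProcessingBackend() = default;
    virtual std::string describe() const = 0;
};

class LocalBackend : public ProcessingBackend {
public:
    std::string describe() const override { return "tracker-local resources"; }
};

class OnlineFarmBackend : public ProcessingBackend {
public:
    std::string describe() const override { return "CMS online computing farm"; }
};

// The commissioning code is written against ProcessingBackend only, so the
// choice of resources becomes a single configuration switch.
std::unique_ptr<ProcessingBackend> makeBackend(bool useOnlineFarm) {
    if (useOnlineFarm)
        return std::make_unique<OnlineFarmBackend>();
    return std::make_unique<LocalBackend>(); // default during start-up
}
```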

The software is to be used for the final, complete micro-strip tracker, as well as during the CMS Cosmic
Challenge and large-scale system tests at the CERN Tracker Integration Facility.

Primary author

Robert Bainbridge (Imperial College London)
