Conveners
Online Computing: Monday
- Wainer Vandelli (INFN)
- Pierre Vande Vyvre (CERN)
Online Computing: Tuesday
- Gordon Watts (University of Washington)
- Rainer Mankel (CERN)
Online Computing: Thursday
- Clara Gaspar (CERN)
Description
Sponsored by ACEOLE
Dr Johannes Gutleber (CERN)
3/23/09, 2:00 PM
Online Computing
oral
The CMS data acquisition system consists of two major subsystems: event building and event filter.
This paper describes the architecture and design of the software that processes the data
flow in the currently operating experiment. The central DAQ system relies heavily on industry-standard
networks and processing equipment. Adopting a single software infrastructure in
all...
Werner Wiedenmann (University of Wisconsin)
3/23/09, 2:20 PM
Online Computing
oral
Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data-flow environment. The ATLAS High Level Trigger (HLT) framework based on the GAUDI and...
Mr SooHyung Lee (Korea Univ.)
3/23/09, 2:40 PM
Online Computing
oral
Real-time data analysis at next-generation experiments is a challenge because of their enormous data rates and sizes. The SuperKEKB experiment, the upgrade of the Belle experiment, will need to process 100 times more data than the current experiment, taken at 10 kHz. Offline-level data analysis in the HLT farm is necessary for efficient data reduction.
The real-time processing of such huge data volumes is also...
Dr Clara Gaspar (CERN)
3/23/09, 3:00 PM
Online Computing
oral
LHCb has designed and implemented an integrated Experiment Control System. The Control System uses the same concepts and the same tools to control and monitor all parts of the experiment: the Data Acquisition System, the Timing and the Trigger Systems, the High Level Trigger Farm, the Detector Control System, the Experiment's Infrastructure and the interaction with the CERN Technical Services...
Ms Chiara Zampolli (CERN)
3/23/09, 3:20 PM
Online Computing
oral
The ALICE experiment is the dedicated heavy-ion experiment at the CERN LHC and will take data with a bandwidth of up to 1.25 GB/s. It consists of 18 subdetectors that interact with five online systems (DAQ, DCS, ECS, HLT and Trigger). Data recorded are read out by DAQ in a raw data stream produced by the subdetectors. In addition the subdetectors produce conditions data derived from the raw...
Prof. Gordon Watts (University of Washington)
3/23/09, 3:40 PM
Online Computing
oral
The DZERO Level 3 Trigger and data acquisition system has been running successfully since March 2001, taking data for the DZERO experiment located at the Tevatron at the Fermi National Accelerator Laboratory. Based on commodity parts, it reads out 65 VME front-end crates and delivers the 250 MB of data to one of 1200 processing cores for a high-level trigger decision at a rate of 1 kHz. Accepted...
Daniel Sonnick (University of Applied Sciences Kaiserslautern)
3/23/09, 4:30 PM
Online Computing
oral
In LHCb, raw data files are created on a high-performance storage
system using custom, speed-optimized file-writing software. The
file writing is orchestrated by a database, which represents the
life cycle of a file and is the entry point for all file-related
operations such as run start, run stop, file migration, file pinning
and, ultimately, file deletion.
File copying to the...
Mr Matteo Marone (Università degli Studi di Torino & INFN Torino)
3/23/09, 4:50 PM
Online Computing
oral
The CMS detector at the LHC is equipped with a high-precision lead tungstate
crystal electromagnetic calorimeter (ECAL).
The front-end boards and the photodetectors are monitored using a network
of DCU (Detector Control Unit) chips located on the detector electronics.
The DCU data are accessible through token rings controlled by an XDAQ-based
software component.
Relevant parameters are...
Mr Barthélémy von Haller (CERN)
3/23/09, 5:10 PM
Online Computing
oral
ALICE is one of the four experiments installed at the CERN Large Hadron Collider (LHC), especially designed for the study of heavy-ion collisions.
The online Data Quality Monitoring (DQM) is an important part of the data acquisition (DAQ) software. It involves the online gathering, the analysis by user-defined algorithms and the visualization of monitored data.
This paper presents the final...
Dr Hannes Sakulin (European Organization for Nuclear Research (CERN))
3/23/09, 5:30 PM
Online Computing
oral
The CMS Data Acquisition cluster, which runs around 10000 applications, is configured dynamically at run time. XML configuration documents determine what applications are executed on each node and over what networks these applications communicate. Through this mechanism the DAQ System may be adapted to the required performance, partitioned in order to perform (test-) runs in parallel, or...
Giovanni Polese (Lappeenranta Univ. of Technology)
3/23/09, 5:50 PM
Online Computing
oral
The Resistive Plate Chamber system is composed
of 912 double-gap chambers equipped with about 10^4 front-end
boards. The correct and safe operation of the RPC system
requires a sophisticated and complex online Detector Control
System, able to monitor and control 10^4 hardware devices
distributed over an area of about 5000 m^2. The RPC DCS acquires,
monitors and stores about 10^5 parameters...
Alina Corso-Radu (University of California, Irvine)
3/23/09, 6:10 PM
Online Computing
oral
ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN, which was put into operation this year. The challenging experimental environment and the extreme detector complexity required the development of a highly scalable distributed monitoring framework, which is currently being used to monitor the quality of the data being taken as well as the operational conditions of the...
Albert Puig Navarro (Universidad de Barcelona), Markus Frank (CERN)
3/24/09, 2:40 PM
Online Computing
oral
The LHCb experiment at the LHC accelerator at CERN will collide particle bunches at 40 MHz. After a first level of hardware trigger with an output rate of 1 MHz, the physically interesting collisions will be selected by running dedicated trigger algorithms in the High Level Trigger (HLT) computing farm, which consists of up to roughly 16000 CPU cores and 44 TB of storage space. Although limited by...
Dr Alessandro Di Mattia (MSU)
3/24/09, 3:00 PM
Online Computing
oral
ATLAS is one of the two general-purpose detectors at the Large Hadron Collider (LHC). The trigger system is responsible for the online selection of interesting collision events. At the LHC design luminosity of 10^34 cm^-2 s^-1 it will need to achieve a rejection factor of the order of 10^7 against random proton-proton interactions, while selecting with high efficiency events that are...
Dr Silvia Amerio (University of Padova & INFN Padova)
3/24/09, 3:20 PM
Online Computing
oral
The Silicon Vertex Trigger (SVT) is a processor developed at the CDF experiment to perform fast and precise online track reconstruction. SVT is made of two pipelined processors: the Associative Memory, which finds low-precision tracks, and the Track Fitter, which refines the track quality with high-precision fits. We will describe the architecture and the performance of a next-generation track fitter,...
Vasco Chibante Barroso (CERN)
3/24/09, 3:40 PM
Online Computing
oral
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). Specific calibration tasks are performed regularly for each of the 18 ALICE sub-detectors in order to achieve the most accurate physics measurements. These procedures involve event analysis in a wide...
Dr Hans G. Essel (GSI)
3/24/09, 4:30 PM
Online Computing
oral
For the new experiments at FAIR, new data acquisition concepts
have to be developed, such as the distribution of self-triggered, time-stamped
data streams over high-performance networks for event building.
The Data Acquisition Backbone Core (DABC) is a general-purpose software
framework designed for the implementation of such data acquisition systems.
It is based on C++ and...
Dr Mark Sutton (University of Sheffield)
3/24/09, 4:50 PM
Online Computing
oral
The ATLAS experiment is one of two general-purpose experiments at the Large Hadron Collider (LHC). It has a three-level trigger, designed to reduce the 40 MHz bunch-crossing rate to about 200 Hz for recording. Online track reconstruction, an essential ingredient to achieve this design goal, is performed at the software-based second (L2) and third levels (Event Filter, EF), running on farms of...
Belmiro Pinto (Universidade de Lisboa)
3/24/09, 5:10 PM
Online Computing
oral
The ATLAS experiment uses a complex trigger strategy to achieve the necessary Event Filter output rate, making it possible to optimize the storage and processing needs for these data. These needs are described in the ATLAS Computing Model, which embraces Grid concepts. The output coming from the Event Filter will consist of three main streams: a primary stream, the express stream and...
Mr Pablo Martinez Ruiz Del Arbol (Instituto de Física de Cantabria)
3/24/09, 5:30 PM
Online Computing
oral
The alignment of the Muon System of CMS is performed using different techniques: photogrammetry measurements, optical alignment and alignment with tracks. For track-based alignment, several methods are employed, ranging from a hit-impact point (HIP) algorithm and a procedure exploiting chamber overlaps to a global fit method based on the Millepede approach. For start-up alignment, cosmic muon...
Jean-Christophe Garnier (CERN)
3/24/09, 5:50 PM
Online Computing
oral
The High Level Trigger and Data Acquisition system selects about 2 kHz of events out of the 40 MHz of beam crossings. The selected events are consolidated into files on onsite storage and then sent to permanent storage for subsequent analysis on the Grid. For local and full-chain tests, a method is needed to exercise the data flow through the High Level Trigger when no actual data are available....
Mr Bjorn Nordkvist (Stockholm University), on behalf of the ATLAS Tile Calorimeter system
3/24/09, 6:10 PM
Online Computing
oral
The ATLAS Tile Calorimeter is ready for data taking during the
proton-proton collisions provided by the Large Hadron Collider (LHC). The
Tile Calorimeter is a sampling calorimeter with iron absorbers and
scintillators as the active medium. The scintillators are read out by
wavelength-shifting fibers and PMTs. The LHC provides collisions every 25 ns,
putting very stringent requirements on the...
Mr Pierre Vande Vyvre (CERN)
3/26/09, 4:30 PM
Online Computing
oral
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A large-bandwidth and flexible Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to...
Dr Jose Antonio Coarasa Perez (Department of Physics, Univ. of California at San Diego (UCSD), and CERN, Geneva, Switzerland)
3/26/09, 4:50 PM
Online Computing
oral
The CMS online cluster consists of more than 2000 computers, mostly running Scientific Linux CERN, hosting the 10000 application instances responsible for data acquisition and experiment control on a 24/7 basis.
The challenging scale of the cluster constrained the design and implementation of the infrastructure:
- The critical nature of the control applications demands a tight...
Mr Thilo Pauly (CERN)
3/26/09, 5:10 PM
Online Computing
oral
The ATLAS Level-1 Central Trigger (L1CT) electronics is a central part of ATLAS data-taking. It receives the 40 MHz bunch clock from the LHC machine and distributes it to all sub-detectors. It initiates the detector read-out by forming the Level-1 Accept decision, which is based on information from the calorimeter and muon trigger processors, plus a variety of additional trigger inputs from...
Yoshiji Yasu (High Energy Accelerator Research Organization (KEK))
3/26/09, 5:30 PM
Online Computing
oral
DAQ-Middleware is a software framework for network-distributed DAQ systems based on Robot Technology Middleware, an international standard of the Object Management Group (OMG) in robotics developed by AIST. DAQ-Component is the software unit of DAQ-Middleware. Basic components have already been developed: for example, Gatherer is a readout component, Logger is a logging component, Monitor is...
Giovanni Organtini (Univ. + INFN Roma 1)
3/26/09, 5:50 PM
Online Computing
oral
The Electromagnetic Calorimeter (ECAL) of the CMS experiment at the LHC
is made of about 75000 scintillating crystals.
The detector properties must be continuously monitored
in order to ensure the extreme stability and precision required by its design.
This leads to a very large volume of non-event data to be accessed continuously by
shifters, experts, automatic monitoring tasks,...
Dr Ivan Kisel (GSI Helmholtzzentrum für Schwerionenforschung GmbH, Darmstadt)
3/26/09, 6:10 PM
Online Computing
oral
The CBM Collaboration is building a dedicated heavy-ion experiment to investigate the properties of highly compressed baryonic matter as produced in nucleus-nucleus collisions at the Facility for Antiproton and Ion Research (FAIR) in Darmstadt, Germany. This requires the collection of a huge number of events, which can only be obtained with very high reaction rates and long data-taking periods....