Conveners
Track 1 Session: #1 (Upgrades and parallelism)
- Markus Frank (CERN)
Track 1 Session: #2 (Data acquisition and electronics)
- Matthias Richter (Department of Physics and Technology, University of Bergen / University of Oslo (NO))
Track 1 Session: #3 (Data acquisition and electronics)
- Emilio Meschi (CERN)
Track 1 Session: #4 (Online reconstruction and control systems)
- Andrew Norman (Fermilab)
Track 1 Session: #5 (Online reconstruction and control systems)
- Dmytro Kresan (GSI - Helmholtzzentrum für Schwerionenforschung GmbH (DE))
Description
Online computing
Frederic Bruno Magniette
(Ecole Polytechnique (FR))
4/13/15, 2:00 PM
Track1: Online computing
oral presentation
High-energy physics experiments produce huge amounts of data that need to be processed and stored for further analysis and, in some cases, treated in real time for triggering and monitoring purposes. In addition, these requirements are increasingly found in other fields as well, such as online video processing, proteomics, and astronomical facilities.
The complexity of such experiments...
Dr
Paolo Branchini
(INFN Roma Tre)
4/13/15, 2:15 PM
Track1: Online computing
oral presentation
The Data Acquisition System (DAQ) and the front-end electronics for an array of Kinetic Inductance Detectors (KIDs) are described. KIDs are superconductive detectors in which electrons are organized in Cooper pairs. Incident radiation can break such pairs, generating quasi-particles, which increase the inductance of the detector. Electrically, any KID is equivalent to a...
Julian Glatzer
(CERN)
4/13/15, 2:30 PM
Track1: Online computing
oral presentation
The ATLAS Level-1 Central Trigger (L1CT) system is a central part of
ATLAS data-taking and is configured, controlled, and monitored by a
software framework with emphasis on reliability and flexibility. The
hardware has undergone a major upgrade for Run 2 of the LHC, in order
to cope with the expected increase of instantaneous luminosity of a
factor of 2 with respect to Run 1. It offers...
Eduard Ebron Simioni
(Johannes-Gutenberg-Universitaet Mainz (DE))
4/13/15, 2:45 PM
Track1: Online computing
oral presentation
The Large Hadron Collider (LHC) in 2015 will collide proton beams with
increased luminosity from $10^{34}$ up to $3 \times 10^{34}$ cm$^{-2}$ s$^{-1}$. ATLAS
is an LHC experiment designed to measure decay properties of highly
energetic particles produced in these proton collisions. The high
luminosity places stringent physical and operational requirements on
the ATLAS Trigger in order to...
Helio Takai
(Brookhaven National Laboratory (US))
4/13/15, 3:00 PM
Track1: Online computing
oral presentation
The global feature extractor (gFEX) is a component of the Level-1
Calorimeter trigger Phase-I upgrade for the ATLAS experiment. It is
intended to identify patterns of energy associated with the hadronic
decays of high momentum Higgs, W, and Z bosons, top quarks, and exotic
particles in real time at the LHC crossing rate. The single processor
board will be implemented as a fast reconfigurable...
Wesley Gohn
(U)
4/13/15, 3:15 PM
Track1: Online computing
oral presentation
A new measurement of the anomalous magnetic moment of the muon, $a_{\mu} \equiv (g-2)/2$, will be performed at the Fermi National Accelerator Laboratory. The most recent measurement, performed at Brookhaven National Laboratory and completed in 2001, shows a 3.3-3.6 standard deviation discrepancy with the standard model value of $g$-$2$. The new measurement will accumulate 20 times those...
Ludovico Bianchi
(Forschungszentrum Jülich)
4/13/15, 3:30 PM
Track1: Online computing
oral presentation
The PANDA experiment is a next generation particle detector planned for operation at the FAIR facility, currently under construction in Darmstadt, Germany. PANDA will detect events generated by colliding an antiproton beam on a fixed proton target, allowing studies in hadron spectroscopy, hypernuclei production, open charm and nucleon structure.
The nature of hadronic collisions means that...
Emilio Meschi
(CERN)
4/14/15, 2:00 PM
Track1: Online computing
oral presentation
Technology convergences in the post-LHC era
In the course of the last three decades, HEP experiments have had to face the challenge of manipulating ever larger volumes of data from increasingly complex and heterogeneous detectors with hundreds of millions of electronic channels. The traditional approach of low-level data reduction using ad hoc electronics working on fast analog...
Christopher Jon Lee
(University of Johannesburg (ZA))
4/14/15, 2:15 PM
Track1: Online computing
oral presentation
The ATLAS Trigger and Data Acquisition (TDAQ) system is responsible for the online processing of live data streaming from the ATLAS experiment at the Large Hadron Collider (LHC) at CERN. The online farm is composed of ~3000 servers, processing the data read out from ~100 million detector channels through multiple trigger levels.
During the two years of the first Long Shutdown (LS1) there has...
Reiner Hauser
(Michigan State University (US))
4/14/15, 2:30 PM
Track1: Online computing
oral presentation
After its first shutdown, the LHC will provide pp collisions with increased luminosity and energy. In the ATLAS experiment, the Trigger and Data Acquisition (TDAQ) system has been upgraded to deal with the increased event rates. The Data Flow (DF) element of the TDAQ is a distributed hardware and software system responsible for buffering and transporting event data from the Readout system to the...
Jorn Schumacher
(University of Paderborn (DE))
4/14/15, 2:45 PM
Track1: Online computing
oral presentation
The ATLAS experiment at CERN is planning the full deployment of a new, unified link technology for connecting detector front-end electronics on the timescale of the LHC Run 4 (2025). It is estimated that roughly 8000 Gigabit Transceiver links (GBT), with transfer rates probably up to 9.6 Gbps, will replace existing links used for readout, detector control and distribution of timing and trigger...
Remi Mommsen
(Fermi National Accelerator Lab. (US))
4/14/15, 3:00 PM
Track1: Online computing
oral presentation
The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event...
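The quoted figures imply an average event size of about 1 MB; a quick back-of-the-envelope check, using only the numbers given in the abstract:

```python
# Average event size implied by the CMS DAQ figures quoted above:
# 100 GB/s aggregate throughput at a 100 kHz event-building rate.
throughput_bytes_per_s = 100e9  # 100 GB/s
event_rate_hz = 100e3           # 100 kHz
avg_event_size_mb = throughput_bytes_per_s / event_rate_hz / 1e6
print(avg_event_size_mb)  # 1.0 (MB per event)
```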
Emilio Meschi
(CERN)
4/14/15, 3:15 PM
Track1: Online computing
oral presentation
During the LHC Long Shutdown 1, the CMS DAQ system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has...
Georgiana Lavinia Darlea
(Massachusetts Inst. of Technology (US))
4/14/15, 3:30 PM
Track1: Online computing
oral presentation
The CMS experiment at CERN is one of the two general-purpose detectors at the Large Hadron Collider (LHC) in the Geneva area, Switzerland. Its infrastructure has undergone massive upgrades during 2013 and 2014, which led to major changes in the philosophy of its DAQ (Data AcQuisition) system. One of the major components of this system is the Storage Manager, which is responsible for buffering...
Srecko Morovic
(CERN)
4/14/15, 3:45 PM
Track1: Online computing
oral presentation
A flexible monitoring system has been designed for the CMS File-based Filter Farm making use of modern data mining and analytics components. All the metadata and monitoring information concerning data flow and execution of the HLT are generated locally in the form of small “documents” using the JSON encoding. These documents are indexed into a hierarchy of elasticsearch (es) clusters along...
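As a rough illustration of the scheme described (the field names, the per-run index naming, and the run number below are invented for the example; the actual FFF document schema is not shown here), a JSON monitoring document might be built and targeted at an elasticsearch index like this:

```python
import json
from datetime import datetime, timezone

def make_hlt_doc(run, lumisection, events_in, events_out):
    """Build a small JSON monitoring 'document' of the kind described
    above; all field names here are illustrative, not the real schema."""
    return {
        "run": run,
        "ls": lumisection,
        "events_in": events_in,
        "events_out": events_out,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def index_name(run):
    # hypothetical per-run index, a common elasticsearch partitioning scheme
    return "hlt-run%06d" % run

doc = make_hlt_doc(run=123456, lumisection=12, events_in=1000, events_out=37)
body = json.dumps(doc)              # serialized document, ready to index
print(index_name(doc["run"]))       # hlt-run123456
```

The actual HTTP request to an elasticsearch node is omitted; the point is only that each local process emits small self-describing JSON documents that downstream clusters can index and aggregate.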
Matthias Richter
(University of Oslo (NO))
4/14/15, 4:30 PM
Track1: Online computing
oral presentation
An upgrade of the ALICE detector is currently being prepared for the Run 3 period of the Large Hadron
Collider (LHC) at CERN starting in 2020. The physics topics under study by ALICE during this
period will require the inspection of all collisions at a rate of 50 kHz for minimum bias Pb-Pb and 200
kHz for pp and p-Pb collisions in order to extract physics signals embedded into a large...
Alexey Rybalchenko
(GSI - Helmholtzzentrum für Schwerionenforschung GmbH (DE))
4/14/15, 4:45 PM
Track1: Online computing
oral presentation
After Long Shutdown 2, the upgraded ALICE detector at the LHC will produce more than a terabyte of data per second. The data, consisting of a continuous, untriggered stream, have to be distributed from about 250 First Level Processor nodes (FLPs) to O(1000) Event Processing Nodes (EPNs). Each FLP receives a small subset of the detector data, chopped into sub-timeframes. One EPN...
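A minimal sketch of the distribution constraint described above (the node counts come from the abstract; the modulo assignment is a hypothetical simplification of the real scheduling):

```python
# Sub-timeframes carrying the same timeframe id, produced independently
# on every FLP, must all converge on one EPN so that the full timeframe
# can be assembled there. A simple (simplified) assignment rule:
N_FLP = 250    # First Level Processor nodes
N_EPN = 1000   # Event Processing Nodes, O(1000)

def epn_for_timeframe(tf_id, n_epn=N_EPN):
    # deterministic mapping: same timeframe id -> same EPN
    return tf_id % n_epn

# every FLP independently routes sub-timeframe 1234 to the same EPN
targets = {epn_for_timeframe(1234) for _ in range(N_FLP)}
print(sorted(targets))  # [234]
```

The real scheme must also balance load and cope with busy or failed EPNs, which a static modulo cannot do; this only illustrates the convergence requirement.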
Josef Novy
(Czech Technical University (CZ))
4/14/15, 5:00 PM
Track1: Online computing
oral presentation
This contribution focuses on the deployment and first results of the new data acquisition system (DAQ) of the COMPASS experiment, utilizing an FPGA-based event builder. The new DAQ system is developed under the name RCCARS (run control, configuration, and readout system).
COMPASS is a high energy physics experiment situated at the SPS particle accelerator at CERN laboratory in Geneva, Switzerland....
Katsuki Hiraide
(The University of Tokyo)
4/14/15, 5:15 PM
Track1: Online computing
oral presentation
XMASS is a multi-purpose low-background experiment with a large volume of liquid xenon scintillator at Kamioka in Japan. The first phase of the experiment aiming at direct detection of dark matter was commissioned in 2010 and is currently taking data.
The detector uses ~830 kg of liquid xenon viewed by 642 photomultiplier tubes (PMTs). Signals from 642 PMTs are amplified and read out by 1...
Asato Orii
4/14/15, 5:30 PM
Track1: Online computing
oral presentation
Super-Kamiokande (SK), a 50-kiloton water Cherenkov detector, is one of
the most sensitive neutrino detectors. SK is continuously collecting
data as a neutrino observatory and can also be used for supernova
observations by detecting supernova burst neutrinos.
It has been reported that Betelgeuse (640 ly) shrank by 15% over 15 years
(C. H. Townes et al. 2009), and this may be an...
The Electronics, Online Trigger System and Data Acquisition System of the J-PARC E16 Experiment
Tomonori Takahashi
(Research Center for Nuclear Physics, Osaka University)
4/14/15, 5:45 PM
Track1: Online computing
oral presentation
1. Introduction
The J-PARC E16 experiment aims to investigate the chiral symmetry restoration in cold nuclear matter and the origin of the hadron mass through the systematic study of the mass modification of vector mesons.
In the experiment,
the $e^{+}e^{-}$ decay of slowly-moving $\phi$ mesons at normal nuclear matter density is intensively studied using several nuclear targets (H,...
Mr
Eitaro Hamada
(High Energy Accelerator Research Organization (KEK))
4/14/15, 6:00 PM
Track1: Online computing
oral presentation
**1. Introduction**
We developed a DAQ system for the J-PARC E16 experiment using DAQ-Middleware. We evaluated the DAQ system and confirmed that it can be applied to the experiment.
The DAQ system receives an average of 660 MB/spill of data (a 2-second spill per 6-second cycle). In order to receive such a large quantity of data, we need a network-distributed system....
Dr
Sergey Linev
(GSI DARMSTADT)
4/14/15, 6:15 PM
Track1: Online computing
oral presentation
The *Data Acquisition Backbone Core* (*DABC*) is a C++ software framework that can implement and run various data acquisition solutions on Linux platforms. In 2013, version 2 of *DABC* was released with several improvements. These developments take into account the extensive practical experience gained with *DABC v1* in detector test beams and laboratory set-ups since its first release in 2009. The...
Dr
Andrea Bocci
(CERN)
4/16/15, 9:00 AM
Track1: Online computing
oral presentation
The CMS experiment has been designed with a 2-level trigger system: the Level 1 Trigger, implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running on the available computing power, the...
Andrea Perrotta
(Universita e INFN, Bologna (IT))
4/16/15, 9:15 AM
Track1: Online computing
oral presentation
The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the...
Markus Frank
(CERN)
4/16/15, 9:30 AM
Track1: Online computing
oral presentation
The LHCb experiment at the LHC accelerator at CERN collects collisions of particle
bunches at 40 MHz. After a first level of hardware trigger with output of 1 MHz,
the physically interesting collisions are selected by running dedicated trigger
algorithms in the High Level Trigger (HLT) computing farm. This farm consists of
up to roughly 25000 CPU cores in roughly 1600 physical nodes...
Tatiana Likhomanenko
(National Research Centre Kurchatov Institute (RU))
4/16/15, 9:45 AM
Track1: Online computing
oral presentation
The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are a) detached from the primary proton-proton collision and b) compatible with coming from the decay of a b-hadron. In LHC Run 1, this trigger utilized a custom boosted decision tree algorithm and selected an almost 100% pure sample of...
Sean Benson
(CERN)
4/16/15, 10:00 AM
Track1: Online computing
oral presentation
The LHCb experiment will record an unprecedented dataset of beauty and charm hadron decays during Run II of the LHC, set to take place between 2015 and 2018. A key computing challenge is to store and process this data, which limits the maximum output rate of the LHCb trigger. So far, LHCb has written out a few kHz of events containing the full raw sub-detector data, which are passed through a...
Dmytro Kresan
(GSI - Helmholtzzentrum für Schwerionenforschung GmbH (DE))
4/16/15, 10:15 AM
Track1: Online computing
oral presentation
The R3B (Reactions with Rare Radioactive Beams) experiment is one of the planned experiments at the future FAIR facility at GSI Darmstadt. R3B will cover experimental reaction studies with exotic nuclei far off stability, thus enabling a broad physics program with rare-isotope beams, with emphasis on nuclear structure and dynamics. Several different detection subsystems as well as...
Dr
Andrew Norman
(Fermilab)
4/16/15, 11:00 AM
Track1: Online computing
oral presentation
The NOvA experiment uses a continuous, free-running, dead-timeless data acquisition system to collect data from the 14 kT far detector. The DAQ system reads out more than 344,000 detector channels and assembles the information into a raw, unfiltered, high-bandwidth data stream. The NOvA trigger systems operate in parallel to the readout and asynchronously to the primary DAQ readout/event...
Mikolaj Krzewicki
(Johann-Wolfgang-Goethe Univ. (DE))
4/16/15, 11:15 AM
Track1: Online computing
oral presentation
The ALICE High Level Trigger (HLT) is an online reconstruction, triggering and data compression system used in the ALICE experiment at CERN. Unique among the LHC experiments, it extensively uses modern coprocessor technologies like general purpose graphic processing units (GPGPU) and field programmable gate arrays (FPGA) in the data flow. Real-time data compression is performed using a cluster...
Manuel Martin Marquez
(CERN)
4/16/15, 11:30 AM
Track1: Online computing
oral presentation
Data science is about unlocking valuable insights and obtaining deep knowledge from data. Its application enables more efficient day-to-day operations and more intelligent decision-making processes. CERN has been very successful in developing custom data-driven control and monitoring systems. Several million control devices: sensors, front-end equipment, etc., make up these...
Yu Higuchi
(High Energy Accelerator Research Organization (JP))
4/16/15, 11:45 AM
Track1: Online computing
oral presentation
The ATLAS trigger has been used very successfully for the online event
selection during the first run of the LHC between 2009 and 2013 at a
centre-of-mass energy between 900 GeV and 8 TeV. The trigger system
consists of a hardware Level-1 (L1) and a software based high-level
trigger (HLT) that reduces the event rate from the design
bunch-crossing rate of 40 MHz to an average recording rate of...
Carlo Schiavi
(Universita e INFN Genova (IT))
4/16/15, 12:00 PM
Track1: Online computing
oral presentation
Following the successful Run-1 LHC data-taking, the long shutdown gave the opportunity for significant improvements in the ATLAS trigger capabilities, as a result of the introduction of new or improved Level-1 trigger hardware and significant restructuring of the DAQ infrastructure. To make use of these new capabilities, the High Level Trigger (HLT) software has been to a large extent...
Arnim Balzer
(Universiteit van Amsterdam)
4/16/15, 12:15 PM
Track1: Online computing
oral presentation
The High Energy Stereoscopic System (H.E.S.S.) is an array of five imaging atmospheric Cherenkov telescopes located in the Khomas Highland in Namibia. Very high energy gamma rays are detected using the Imaging Atmospheric Cherenkov Technique. It separates the Cherenkov light emitted by the background of mostly hadronic air showers from the light emitted by air showers induced by gamma rays....