Alexei Klimentov
(Brookhaven National Laboratory (US)),
Andrey Kirianov
(B.P. Konstantinov Petersburg Nuclear Physics Institute - PNPI (RU)),
Andrey Zarochentsev
(St. Petersburg State University (RU)),
Dimitrii Krasnopevtsev
(National Research Nuclear University MEPhI (RU))
19/01/2016, 14:00
Computing Technology for Physics Research
Oral
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate a growth of storage needs of at least an order of magnitude; it will...
Max Fischer
(KIT - Karlsruhe Institute of Technology (DE))
19/01/2016, 14:25
Computing Technology for Physics Research
Oral
With the increasing data volumes of the second LHC run, analysis groups have to handle unprecedented amounts of data.
This pushes many compute clusters that rely on network-based storage to their limits.
In contrast, data-locality-based processing enables infrastructure to scale out practically indefinitely.
However, data locality frameworks and infrastructure often add severe constraints and...
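As a loose illustration of the data-locality idea described above (not code from the talk), the following sketch assigns analysis tasks to the worker nodes that already hold their input files and falls back to remote reads otherwise; all node and file names are hypothetical.

```python
# Minimal sketch of data-locality scheduling: dispatch tasks to nodes that
# already hold the input data instead of pulling it over the network.
# All names are hypothetical, not from the presented infrastructure.

def schedule_by_locality(tasks, file_locations):
    """Map each task to a node hosting its input file, if any.

    tasks: list of (task_id, input_file) pairs
    file_locations: dict mapping file name -> list of nodes holding a replica
    """
    assignments = {}
    for task_id, input_file in tasks:
        replicas = file_locations.get(input_file, [])
        # Prefer a node with a local replica; otherwise accept a remote read.
        assignments[task_id] = replicas[0] if replicas else "any-node-remote-read"
    return assignments

if __name__ == "__main__":
    tasks = [("t1", "run2_data_001.root"), ("t2", "run2_data_002.root")]
    file_locations = {"run2_data_001.root": ["worker03", "worker07"]}
    print(schedule_by_locality(tasks, file_locations))
    # {'t1': 'worker03', 't2': 'any-node-remote-read'}
```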
Nikita Kazeev
(Yandex School of Data Analysis (RU))
19/01/2016, 14:50
Computing Technology for Physics Research
Oral
Experiments in high energy physics routinely require processing and storing massive amounts of data. The LHCb Event Index is an indexing system for high-level event parameters. Its primary function is to quickly select subsets of events. This paper discusses applications of the Event Index to the optimization of the data-processing pipeline.
The processing and storage capacity is limited and divided...
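A minimal sketch, not the LHCb Event Index implementation, of the general idea of indexing high-level event parameters so that subsets of events can be selected without re-reading the full event records; the parameter names and trigger labels below are hypothetical.

```python
# Hypothetical events described only by a few high-level parameters.
events = [
    {"run": 100, "event": 1, "n_tracks": 12, "trigger": "Hlt2DiMuon"},
    {"run": 100, "event": 2, "n_tracks": 45, "trigger": "Hlt2Topo2Body"},
    {"run": 101, "event": 1, "n_tracks": 30, "trigger": "Hlt2DiMuon"},
]

# Build an inverted index from a parameter value to (run, event) identifiers.
index = {}
for ev in events:
    index.setdefault(("trigger", ev["trigger"]), []).append((ev["run"], ev["event"]))

# Quickly select the subset of events that fired a given trigger line,
# without touching the full event records.
selected = index.get(("trigger", "Hlt2DiMuon"), [])
print(selected)  # [(100, 1), (101, 1)]
```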
Dr Matthias Fuessling
(DESY),
Peter Wegner
(DESY)
19/01/2016, 15:45
Oral
The Cherenkov Telescope Array (CTA), as the next generation ground-based very high-energy gamma-ray observatory, is defining new areas beyond those related to physics; it is also creating new demands on the control and data acquisition system. CTA will consist of two installations, one in each hemisphere, containing tens of telescopes of different sizes. The ACTL (array control and data...
Naoki Kimura
(Aristotle Univ. of Thessaloniki (GR))
19/01/2016, 16:10
Computing Technology for Physics Research
Oral
In the LHC environment of ever-increasing pile-up, advanced data-analysis techniques are implemented in order to increase the rate of relevant physics processes with respect to background processes.
The Fast TracKer (FTK) is a hardware-level track-finding implementation designed to deliver full-scan tracks with $p_{T}$ above 1 GeV to the ATLAS trigger system for every L1...
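As a purely illustrative sketch (the FTK itself is implemented in hardware), the selection described above amounts to keeping full-scan tracks with transverse momentum above 1 GeV before they are handed to the trigger; the track contents below are hypothetical.

```python
# Hypothetical full-scan tracks; only the transverse-momentum cut is shown.
PT_THRESHOLD_GEV = 1.0

tracks = [
    {"pt_gev": 0.6, "eta": 0.1, "phi": 1.2},
    {"pt_gev": 3.4, "eta": -1.0, "phi": 0.4},
    {"pt_gev": 1.8, "eta": 2.1, "phi": -2.7},
]

# Keep only tracks above the transverse-momentum threshold.
trigger_tracks = [t for t in tracks if t["pt_gev"] > PT_THRESHOLD_GEV]
print(len(trigger_tracks), "tracks passed to the trigger")  # 2 tracks passed
```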
Andrei Gheata
(CERN)
19/01/2016, 16:35
Computing Technology for Physics Research
Oral
The GeantV project aims to research and develop the next-generation simulation software describing the passage of particles through matter, targeting not only modern CPU architectures but also more exotic resources such as GPGPUs, Intel Xeon Phi, Atom or ARM, which can no longer be ignored for HEP computing. While the proof-of-concept GeantV prototype has been mainly engineered for CPU...
Peter Elmer
(Princeton University (US))
19/01/2016, 17:00
Computing Technology for Physics Research
Oral
High Energy Physics (HEP) is well known as a "Big Data" science, but it should also be seen as a "Big Software" enterprise. For example, to support the activities of the Large Hadron Collider at the European Laboratory for Particle Physics (CERN), tens of millions of lines of code have been written by thousands of researchers and engineers over the past 20 years.
The wider scientific...