Tatsiana Klimkovich
(RWTH-Aachen)
04/11/2008, 14:00
2. Data Analysis
Parallel Talk
VISPA is a novel graphical development environment for physics analysis, following an experiment-independent approach. It introduces a new way of steering a physics data analysis, combining graphical and textual programming. The purpose is to speed up the design of an analysis, and to facilitate its control.
As the software basis for VISPA the C++ toolkit Physics eXtension Library (PXL) is...
Alexandre Vaniachine
(Argonne National Laboratory)
04/11/2008, 14:25
2. Data Analysis
Parallel Talk
HEP experiments at the LHC store petabytes of data in ROOT files described with TAG metadata. The LHC experiments have challenging goals for efficient access to this data. Physicists need to be able to compose a metadata query and rapidly retrieve the set of matching events. Such skimming operations will be the first step in the analysis of LHC data, and improved efficiency will facilitate the...
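The core idea above — querying a compact per-event metadata summary to select events before touching the bulk data — can be sketched as follows. This is a hypothetical illustration, not the experiments' actual TAG database interface; the attribute names (n_muons, missing_et) are invented for the example.

```python
# Hedged sketch of TAG-based event skimming: TAG metadata is a compact
# per-event record (run/event numbers plus a few selection variables)
# that a query can scan quickly to pick out matching events.

def skim(tags, predicate):
    """Return (run, event) identifiers of all events whose TAG record
    satisfies the predicate."""
    return [(t["run"], t["event"]) for t in tags if predicate(t)]

tags = [
    {"run": 1, "event": 1, "n_muons": 2, "missing_et": 35.0},
    {"run": 1, "event": 2, "n_muons": 0, "missing_et": 12.0},
    {"run": 1, "event": 3, "n_muons": 1, "missing_et": 80.0},
]

# "Two muons, or large missing transverse energy" expressed as a metadata query.
selected = skim(tags, lambda t: t["n_muons"] >= 2 or t["missing_et"] > 50.0)
print(selected)  # [(1, 1), (1, 3)]
```

The matching identifiers would then drive the actual retrieval of event data from the ROOT files, which is where the efficiency challenges described in the abstract arise.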
Anna Kreshuk
(GSI)
04/11/2008, 14:50
2. Data Analysis
Parallel Talk
This presentation discusses activities at GSI to support interactive data analysis for the LHC experiment ALICE. GSI is a tier-2 centre for ALICE. One focus is a setup where it is possible to dynamically switch the resources between jobs from the Grid, jobs from the local batch system and the GSI Analysis Facility (GSIAF), a PROOF farm for fast interactive analysis. The second emphasis is on...
Axel Naumann
(CERN)
04/11/2008, 15:15
2. Data Analysis
Parallel Talk
High performance computing with a large code base and C++ has proved to be a good combination. But when it comes to storing data, C++ is a really bad choice: it offers no support for serialization, type definitions are amazingly complex to parse, and the dependency analysis (which objects does object A need in order to be stored?) is incredibly difficult. Nevertheless, the LHC data consists of C++ objects that...
Mr
John Alison
(Department of Physics and Astronomy, University of Pennsylvania)
04/11/2008, 16:10
2. Data Analysis
Parallel Talk
CERN's Large Hadron Collider (LHC) is the world's largest particle accelerator. It will collide two proton beams at an unprecedented center-of-mass energy of 14 TeV, and first colliding beams are expected during summer 2008. ATLAS is one of the two general-purpose experiments that will record the decay products of the proton-proton collisions. ATLAS is equipped with a charged-particle...
Dr
Lorenzo Moneta
(CERN)
04/11/2008, 16:35
2. Data Analysis
Parallel Talk
Advanced mathematical and statistical computational methods are required by the LHC experiments for analyzing their data. Some of these methods are provided by the Math work package of the ROOT project, an object-oriented C++ framework for large-scale data handling applications.
We present in detail the recent developments of this work package, in particular the recent improvements in the...
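The abstract does not specify which methods are covered, but as a generic illustration of the kind of statistical computation such a math package provides, here is a minimal weighted least-squares straight-line fit (chi-square minimization in closed form). This is a plain-Python sketch, not ROOT code.

```python
# Weighted least-squares fit of y = a + b*x, minimizing
# chi2 = sum_i ((y_i - a - b*x_i) / sigma_i)^2 analytically.

def line_fit(xs, ys, sigmas):
    """Return (intercept a, slope b) of the chi-square-optimal line."""
    w = [1.0 / s**2 for s in sigmas]
    S = sum(w)
    Sx = sum(wi * x for wi, x in zip(w, xs))
    Sy = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    d = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / d   # intercept
    b = (S * Sxy - Sx * Sy) / d     # slope
    return a, b

# Points lying exactly on y = 1 + 2x are recovered exactly.
a, b = line_fit([0.0, 1.0, 2.0], [1.0, 3.0, 5.0], [0.1, 0.1, 0.1])
print(round(a, 6), round(b, 6))  # 1.0 2.0
```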
Ms
Sonia Khatchadourian
(ETIS - UMR CNRS 8051)
04/11/2008, 17:00
2. Data Analysis
Parallel Talk
The HESS project is a major international experiment in gamma-ray astronomy. The project relies on a system of four Cherenkov telescopes enabling the observation of cosmic gamma rays. The outstanding performance obtained so far in the HESS experiment has led the research labs involved in this project to improve the existing system: an additional telescope is currently...
Miroslav Morhac
(Institute of Physics, Slovak Academy of Sciences)
04/11/2008, 17:25
2. Data Analysis
Parallel Talk
The accuracy and reliability of the analysis of spectroscopic data depend critically on the treatment applied to resolve strong peak overlaps, to account for continuum background contributions, and to distinguish artifacts due to the responses of some detector types. The analysis of spectroscopic data can be divided into
1. estimation of peak positions (peak searching)
2. fitting of peak...
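Step 1 above can be sketched minimally as a scan for local maxima above a threshold. This is an illustrative simplification: real spectroscopic peak searches (e.g. the deconvolution-based methods the talk concerns) also smooth the spectrum and estimate the continuum background first, which is omitted here.

```python
# Naive peak search: flag every channel that is a local maximum and
# stands above a fixed threshold. Background subtraction and smoothing,
# essential in practice, are deliberately left out of this sketch.

def find_peaks(spectrum, threshold):
    """Return indices of local maxima whose content exceeds threshold."""
    peaks = []
    for i in range(1, len(spectrum) - 1):
        if spectrum[i] > threshold and spectrum[i - 1] < spectrum[i] >= spectrum[i + 1]:
            peaks.append(i)
    return peaks

# Two peaks at channels 3 and 8 on a flat background of ~2 counts.
spectrum = [2, 2, 5, 30, 6, 2, 3, 12, 40, 9, 2]
print(find_peaks(spectrum, threshold=10))  # [3, 8]
```

The fitting stage (step 2) would then model each candidate peak, typically as a Gaussian plus background, around the positions found here.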