Description
Chairs:
2:00-3:30: Dugan O'NEIL
4:00-6:00: Federico COLECCHIA
-
Mr Jike Wang (High Energy Group, Institute of Physics, Academia Sinica)
05/09/2011, 14:00 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
ATLAS is a multipurpose experiment that records the LHC collisions. In order to reconstruct the trajectories of charged particles, ATLAS is equipped with a tracking system (the Inner Detector) built using distinct technologies: silicon planar sensors (both pixel and microstrip) and drift tubes. The tracking system is embedded in a 2 T solenoidal field. In order to reach the track parameter...
-
Gero Flucke (DESY (Hamburg))
05/09/2011, 14:25 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
The CMS all-silicon tracker consists of 16588 modules. In 2010 it was successfully aligned using tracks from cosmic rays and pp collisions, following the time-dependent movements of its innermost pixel layers. Ultimate local precision is now achieved by determining the sensor curvatures, challenging the algorithms to determine about 200000 parameters. Remaining alignment...
-
Dr Paul Laycock (University of Liverpool)
05/09/2011, 14:50 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
Over a decade ago, the H1 Collaboration decided to embrace the object-oriented paradigm and completely redesign its data analysis model and data storage format. The event data model, based on the ROOT framework, consists of three layers (tracks and calorimeter clusters, identified particles, and finally event summary data) with a singleton class providing unified access. This...
-
Dr Federico Carminati (CERN)
05/09/2011, 15:15 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
The Monte Carlo technique enables one to generate random samples from distributions with known characteristics and helps to make probability-based inferences about the underlying physical processes. Fast and efficient Monte Carlo particle transport code, particularly for high-energy nuclear and particle physics experiments, has become an important tool, starting from the design and fabrication of...
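The abstract's starting point, generating random samples from a distribution with known characteristics, can be illustrated with a minimal inverse-transform sampling sketch in Python. The exponential distribution and the parameter values here are illustrative choices, not taken from the talk:

```python
import math
import random

def sample_exponential(rate, n, seed=0):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(U)/rate follows an exponential distribution with the
    given decay rate. Returns n such samples."""
    rng = random.Random(seed)
    return [-math.log(rng.random()) / rate for _ in range(n)]

samples = sample_exponential(rate=2.0, n=100_000)
mean = sum(samples) / len(samples)
# The sample mean estimates the true mean 1/rate = 0.5,
# with statistical error shrinking like 1/sqrt(n).
```

The same idea, sampling interaction points and step lengths from known cross-section distributions, underlies Monte Carlo particle transport codes.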
-
Mr Matteo Agostini (Munich Technical University)
05/09/2011, 16:05 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
We present the concept, implementation, and performance of a new software framework developed to provide a flexible and user-friendly environment for advanced analysis and processing of digital signals. The software has been designed to handle the full data analysis flow of GERDA, a low-background experiment which searches for the neutrinoless double beta decay of Ge-76 by using...
-
Dr Manqi Ruan (Laboratoire Leprince-Ringuet (LLR), Ecole Polytechnique)
05/09/2011, 16:30 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
The concept of "particle flow" has been developed to optimise jet energy resolution by best separating the different components of hadronic jets. Highly granular calorimetry is mandatory and provides an unprecedented level of detail in the reconstruction of showers. This enables new approaches to shower analysis. Here the measurement and use of showers' fractal dimension is described....
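A fractal dimension such as the one mentioned in this abstract is commonly estimated by box counting: count the occupied cells N(s) at several cell sizes s and fit the slope of log N(s) versus log(1/s). A minimal sketch, where the 2-D point set and the grid scales are illustrative stand-ins rather than the talk's actual shower data:

```python
import math

def box_counting_dimension(points, scales):
    """Estimate the fractal dimension of a 2-D point set:
    count occupied grid cells N(s) at each cell size s, then
    take the least-squares slope of log N(s) vs. log(1/s)."""
    logs = []
    for s in scales:
        occupied = {(int(x / s), int(y / s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(occupied))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    slope = (sum((x - mx) * (y - my) for x, y in logs)
             / sum((x - mx) ** 2 for x, _ in logs))
    return slope

# Sanity check: points densely filling a line segment should
# yield a dimension close to 1.
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
dim = box_counting_dimension(line, scales=[0.1, 0.05, 0.02, 0.01])
```

In a highly granular calorimeter the readout cells themselves play the role of the boxes, so the counting can be done directly on the hit cells at several effective granularities.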
-
Robert Fischer (RWTH Aachen University, III. Physikalisches Institut A)
05/09/2011, 16:55 | Track 2: Data Analysis - Algorithms and Tools | Parallel talk
Visual Physics Analysis (VISPA) is an analysis development environment with applications in high-energy as well as astroparticle physics. VISPA provides graphical steering of the analysis flow, which is composed of self-written C++ and Python modules. The advances presented in this talk extend the scope from prototyping to the execution of analyses. A novel concept of analysis layers has...