Conveners
Software Components, Tools and Databases: Monday
- Julius Hrivnac (LAL, Orsay)
- Paolo Calafiura (LBNL, Berkeley)
Software Components, Tools and Databases: Tuesday
- Maria Girone (CERN)
- Maria Grazia Pia (INFN)
Software Components, Tools and Databases: Thursday
- Pere Mato (CERN)
- Paolo Calafiura (LBNL, Berkeley)
Dr
Zhen Xie
(Princeton University)
23/03/2009, 14:00
Software Components, Tools and Databases
oral
Non-event data describing detector conditions change with time and
come from different data sources. They are accessible to physicists
within the offline event-processing applications, for precise calibration of reconstructed data as well as for data-quality control purposes.
Over the past three years CMS has developed and deployed a software
system managing such data. Object-relational...
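Conditions data of this kind are typically keyed by an interval of validity (IOV): each payload applies from its start time until the next payload begins. A minimal Python sketch of IOV resolution (illustrative only; names and structure are invented, not the CMS API):

```python
import bisect

class ConditionsStore:
    """Toy interval-of-validity (IOV) store: each payload is valid
    from its start time until the next payload's start time."""
    def __init__(self):
        self._starts = []    # sorted IOV start times
        self._payloads = []  # calibration payloads, parallel to _starts

    def insert(self, since, payload):
        i = bisect.bisect_left(self._starts, since)
        self._starts.insert(i, since)
        self._payloads.insert(i, payload)

    def lookup(self, event_time):
        # Payload in effect at event_time: last IOV starting at or before it.
        i = bisect.bisect_right(self._starts, event_time) - 1
        if i < 0:
            raise KeyError("no conditions valid at %r" % event_time)
        return self._payloads[i]

store = ConditionsStore()
store.insert(0,    {"pedestal": 1.00})
store.insert(1000, {"pedestal": 1.02})  # detector re-calibrated at t=1000
print(store.lookup(500)["pedestal"])    # prints 1.0
```

The binary search makes each lookup O(log n) in the number of IOVs, which matters when a reconstruction job resolves conditions for every event.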
Alexandre Vaniachine
(Argonne),
Rodney Walker
(LMU Munich)
23/03/2009, 14:20
Software Components, Tools and Databases
oral
During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflow, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of required distributed database resources. In distributed...
Andrea Valassi
(CERN)
23/03/2009, 14:40
Software Components, Tools and Databases
oral
The LCG Persistency Framework consists of three software packages (POOL, CORAL and COOL) that address the data access requirements of the LHC experiments in several different areas. The project is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that are using some or all of the Persistency Framework components to access their data....
Dr
Maria Girone
(CERN)
23/03/2009, 15:00
Software Components, Tools and Databases
oral
Originally deployed at CERN for the construction of LEP, relational databases now play a key role in the experiments' production chains, from online acquisition through to offline production, data distribution, reprocessing and analysis. They are also a fundamental building block for the Tier0 and Tier1 data management services. We summarize the key requirements in terms of availability,...
Dr
Andrea Valassi
(CERN)
23/03/2009, 15:20
Software Components, Tools and Databases
oral
The CORAL package is the common relational database abstraction layer of the CERN LCG Persistency Framework, used to access LHC experiment data that is stored using relational database technologies.
A traditional two-tier client-server model is presently used by most CORAL applications accessing relational database servers such as Oracle, MySQL, and SQLite.
A different model, involving a...
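The core idea of such an abstraction layer is that client code names a connection string, not a concrete backend. A toy Python sketch (illustrative only; this is not CORAL's actual API, and only an SQLite "plugin" is wired up):

```python
import sqlite3

def connect(url):
    """Toy database-abstraction entry point: dispatch on the
    connection-string prefix so client code never names a backend."""
    scheme, _, path = url.partition("://")
    if scheme == "sqlite":
        return sqlite3.connect(path)
    # A real layer would load an Oracle/MySQL plugin here.
    raise NotImplementedError("no plugin for backend %r" % scheme)

# Client code is identical whichever backend the URL selects.
db = connect("sqlite://:memory:")
db.execute("CREATE TABLE conditions (run INTEGER, gain REAL)")
db.execute("INSERT INTO conditions VALUES (1, 0.97)")
gain, = db.execute("SELECT gain FROM conditions WHERE run = 1").fetchone()
```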
Dr
David Malon
(Argonne National Laboratory), Dr
Elizabeth Gallas
(University of Oxford)
23/03/2009, 15:40
Software Components, Tools and Databases
oral
Metadata--data about data--arise in many contexts, from many diverse sources,
and at many levels in ATLAS.
Familiar examples include run-level, luminosity-block-level, and event-level metadata, and,
related to processing and organization, dataset-level and file-level metadata,
but these categories are neither exhaustive nor orthogonal.
Some metadata are known a priori, in advance of...
Dr
Jack Cranshaw
(Argonne National Laboratory), Dr
Qizhi Zhang
(Argonne National Laboratory)
23/03/2009, 16:30
Software Components, Tools and Databases
oral
ATLAS has developed and deployed event-level selection services based upon event metadata records ("tags")
and supporting file and database technology.
These services allow physicists to extract events that satisfy their selection predicates from any stage
of data processing and use them as input to later analyses.
One component of these services is a web-based Event-Level Selection...
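The essence of tag-based selection is that a small table of per-event summary metadata is queried instead of the event data itself. A hedged sketch with an invented schema (real ATLAS TAGs carry many more attributes):

```python
import sqlite3

# Toy event-tag table: one row of summary metadata per event,
# queried to pick events for a later analysis stage.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tags (run INTEGER, event INTEGER,"
           " nmuon INTEGER, met REAL)")
db.executemany("INSERT INTO tags VALUES (?, ?, ?, ?)", [
    (1, 1, 0,  5.0),
    (1, 2, 2, 42.0),
    (1, 3, 1, 55.0),
])
# The physicist's selection predicate becomes a WHERE clause; the
# resulting event list is handed to the next processing step.
selected = [e for (e,) in db.execute(
    "SELECT event FROM tags WHERE nmuon >= 1 AND met > 40 ORDER BY event")]
```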
David Lawrence
(Jefferson Lab)
23/03/2009, 16:50
Software Components, Tools and Databases
oral
Calibrations and conditions databases can be accessed from within the JANA Event Processing framework through the API defined in its JCalibration base class. This system allows constants to be retrieved through a single line
of C++ code, with most of the context implied by the run currently being analyzed. The API is designed to support everything from databases to web
services to flat files...
Dr
Ilse Koenig
(GSI Darmstadt)
23/03/2009, 17:10
Software Components, Tools and Databases
oral
Since 2002 the HADES experiment at GSI has employed an Oracle database for storing all parameters relevant for simulation and data analysis. The implementation features flexible, multi-dimensional and easy-to-use version management. Direct interfaces to the ROOT-based analysis and simulation framework HYDRA allow for an automated initialization based on current or historic data, which is needed...
Barbara Martelli
(INFN)
23/03/2009, 17:30
Software Components, Tools and Databases
oral
The LCG File Catalog (LFC) is a key component of the LHC Computing Grid (LCG) middleware, as it contains the mapping between all logical and physical file names on the Grid. The ATLAS computing model foresees multiple local LFCs, hosted at each Tier-1 and at the Tier-0, containing all information about files stored in that cloud. As the local LFC contents are presently not replicated, this results in...
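The catalog's central data structure is the mapping from one logical file name (LFN) to its physical replicas (PFNs). A toy sketch of that mapping (illustrative only, not the LFC API; hostnames are invented):

```python
# Toy replica catalog: each logical file name (LFN) maps to the list
# of physical file names (PFNs) where replicas live on the Grid.
catalog = {}

def register(lfn, pfn):
    """Record one more physical replica of a logical file."""
    catalog.setdefault(lfn, []).append(pfn)

def replicas(lfn):
    """Return all known physical locations of a logical file."""
    return catalog.get(lfn, [])

register("/grid/atlas/data/run1.root",
         "srm://tier1-a.example.org/pool/run1.root")
register("/grid/atlas/data/run1.root",
         "srm://tier1-b.example.org/pool/run1.root")
```

Replicating this mapping across sites, rather than keeping it only in the local cloud, is exactly the availability question the abstract raises.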
Dr
Shaun Roe
(CERN)
23/03/2009, 17:50
Software Components, Tools and Databases
oral
The COOL database in ATLAS is primarily used for storing detector conditions data, but also status flags: uploaded summaries of information indicating the detector's reliability during a run. This paper introduces the use of CherryPy, a Python application server which acts as an intermediate layer between a web interface and the database, providing a simple means of storing to and...
Andressa Sivolella Gomes
(Universidade Federal do Rio de Janeiro (UFRJ))
23/03/2009, 18:10
Software Components, Tools and Databases
oral
The ATLAS detector consists of four major components: inner tracker, calorimeter, muon
spectrometer and magnet system. In the Tile Calorimeter (TileCal) there are 4 partitions; each partition
has 64 modules and each module has up to 48 channels. During the ATLAS commissioning phase, a
group of physicists needs to analyze the Tile Calorimeter data quality, generate reports and update...
Dr
Shaun Roe
(CERN)
24/03/2009, 14:00
Software Components, Tools and Databases
oral
The combination of three relatively recent technologies is described, which allows an easy path from database retrieval to interactive web display. SQL queries on an Oracle database can be performed in a manner that directly returns an XML description of the result, and Ajax techniques (Asynchronous JavaScript And XML) are used to dynamically inject the data into a web display, accompanied by an...
Andreas Hinzmann
(RWTH Aachen University)
24/03/2009, 14:20
Software Components, Tools and Databases
oral
The job configuration system of the CMS experiment is based on the Python programming language. Software modules and their order of execution are both represented by Python objects. In order to investigate and verify configuration parameters and dependencies naturally appearing in modular software, CMS employs a graphical tool. This tool visualizes the configuration objects, their...
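Because modules and their execution order are plain Python objects, the configuration forms a dependency graph that can be rendered directly. A toy sketch (not the CMS tool; module names are invented) that emits Graphviz DOT text, one common way to visualize such a graph:

```python
# Each configuration module lists the modules whose output it consumes.
modules = {
    "tracker_hits":  [],
    "track_fitter":  ["tracker_hits"],
    "vertex_finder": ["track_fitter"],
}

def to_dot(graph):
    """Render a module-dependency graph as Graphviz DOT text."""
    lines = ["digraph config {"]
    for module, inputs in sorted(graph.items()):
        for dep in inputs:
            lines.append('  "%s" -> "%s";' % (dep, module))
    lines.append("}")
    return "\n".join(lines)

dot = to_dot(modules)
```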
Benedikt Hegner
(CERN)
24/03/2009, 14:40
Software Components, Tools and Databases
oral
Being a highly dynamic language that allows reliable programming with quick turnarounds, Python is a widely used programming language in CMS. Most of the tools used in workflow management and the Grid interface tools are written in this language. So are most of the tools used in the context of release management: integration builds, release building and deployment, as well as performance...
Dr
Pere Mato
(CERN)
24/03/2009, 15:00
Software Components, Tools and Databases
oral
GAUDI is a software framework in C++ used to build event data processing applications using a set of standard components with well-defined interfaces. Simulation, high-level trigger, reconstruction, and analysis programs used by several experiments are developed using GAUDI. These applications can be configured and driven by simple Python scripts. Given the fact that a considerable amount of...
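The pattern of driving a C++ framework from Python is that each component is exposed as a configurable object whose properties the job script sets before execution. A hedged Python sketch of the idea (illustrative only; this is not GAUDI's actual configurables machinery, and the names are invented):

```python
class Configurable:
    """Toy stand-in for a framework component: a named object holding
    the properties a Python job script sets before the run starts."""
    def __init__(self, name, **props):
        self.name = name
        self.props = dict(props)

# A "job options" script then reads like a declarative configuration:
app = Configurable("ApplicationMgr", EvtMax=100)
app.props["TopAlg"] = ["TrackFitter/fitter1"]
# The framework would later hand these property values to the C++ side.
```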
Dr
Sebastien Binet
(LBNL)
24/03/2009, 15:20
Software Components, Tools and Databases
oral
Computers are no longer getting faster: instead, they are gaining more and more
CPUs, each of which is no faster than the previous generation.
This increase in the number of cores clearly calls for more parallelism in
HENP software.
While end-users' stand-alone analysis applications are relatively easy to modify,
the LHC experiments' frameworks, being mostly written with a single 'thread'...
Mr
Alexander Zaytsev
(Budker Institute of Nuclear Physics (BINP))
24/03/2009, 15:40
Software Components, Tools and Databases
oral
The Hierarchy Software Development Framework provides a lightweight tool for building portable, modular applications that perform automated data-analysis tasks in batch mode.
Design and development of the project began in March 2005, and from the very beginning it targeted the construction of experimental data-processing applications for the CMD-3...
Marco Clemencic
(European Organization for Nuclear Research (CERN))
24/03/2009, 16:30
Software Components, Tools and Databases
oral
Ten years after its first version, the Gaudi software framework has undergone many changes and improvements, with a consequent growth of the code base. Those changes were almost always introduced preserving backward compatibility and minimizing changes to the framework itself; obsolete code has been removed only rarely. After a release of Gaudi targeted to the...
Mr
Frank van Lingen
(California Institute of Technology), Mr
Stuart Wakefield
(Imperial College)
24/03/2009, 16:50
Software Components, Tools and Databases
oral
Three different projects within CMS produce various workflow related data products: CRAB (analysis centric), ProdAgent (simulation production centric), T0 (real time sorting and reconstruction of real events). Although their data products and workflows are different, they all deal with job life cycle management (creation, submission, tracking, and cleanup of jobs). WMCore provides a set of...
Peter Onyisi
(University of Chicago)
24/03/2009, 17:10
Software Components, Tools and Databases
oral
The ATLAS experiment at the Large Hadron Collider reads out 100 million
electronic channels at a rate of 200 Hz.
Before the data are shipped to storage and analysis centres across the
world, they have to be checked to be free from irregularities which
render them scientifically useless. Data quality offline monitoring
provides prompt feedback from full first-pass event reconstruction...
Zachary Miller
(University of Wisconsin)
24/03/2009, 17:30
Software Components, Tools and Databases
oral
Many secure communication libraries used by distributed systems, such as SSL,
TLS, and Kerberos, fail to make a clear distinction between the authentication,
session, and communication layers. In this paper we introduce CEDAR, the secure
communication library used by the Condor High Throughput Computing software,
and present the advantages to a distributed computing system resulting...
Dr
Rene Brun
(CERN)
24/03/2009, 17:50
Software Components, Tools and Databases
oral
In the last few years ROOT has continued to consolidate and improve the
existing code base and infrastructure. This includes a very smooth transition
to SVN, which subsequently enabled us to reorganize the existing libraries into
semantic packages, which in turn helps improve the documentation.
We also continued to improve performance and reduce the memory footprint, for
example...
David Gonzalez Maline
(CERN)
24/03/2009, 18:10
Software Components, Tools and Databases
oral
ROOT, as a scientific data analysis framework, provides extensive capabilities
via graphical user interfaces (GUIs) for performing interactive analysis and
visualizing data objects such as histograms and graphs. A new fitting interface
has been developed for performing, exploring and comparing fits on data point
sets such as histograms, multi-dimensional graphs or trees.
With this new...
Predrag Buncic
(CERN)
26/03/2009, 14:00
Software Components, Tools and Databases
oral
CernVM is a Virtual Software Appliance to run physics applications from the LHC experiments at CERN. The virtual appliance provides a complete, portable and easy to install and configure user environment for developing and running LHC data analysis on any end-user computer (laptop, desktop) and on the Grid independently of operating system software and hardware platform (Linux, Windows,...
Peter Hristov
(CERN)
26/03/2009, 14:00
Software Components, Tools and Databases
oral
Since 1998 ALICE has been developing the AliRoot framework for offline computing. This talk will critically review the development and present status of the framework. The current functionality for simulation, reconstruction, alignment, calibration and analysis will be described and commented upon. The integration with the Grid and the PROOF systems will be described and discussed.
The talk will also...
Dr
Andrea Chierici
(INFN-CNAF)
26/03/2009, 14:20
Software Components, Tools and Databases
oral
Virtualization is a proven software technology that is rapidly transforming the IT landscape and fundamentally changing the way people compute. Recently, all major software producers (e.g. Microsoft and Red Hat) have developed or acquired virtualization technologies.
Our institute is a Tier1 for LHC experiments and is experiencing lots of benefits from virtualization technologies, like...
Dr
Johan Messchendorp (for the PANDA collaboration)
(University of Groningen)
26/03/2009, 14:20
Software Components, Tools and Databases
oral
The PANDA experiment at the future facility FAIR will provide valuable data for improving our
present understanding of the strong interaction. In preparation for the experiments,
large-scale simulations for design and feasibility studies are performed exploiting a new
software framework, Fair/PandaROOT, which is based on ROOT and the Virtual Monte Carlo
(VMC) interface. In this paper, the various...
Dr
Maria Grazia Pia
(INFN GENOVA)
26/03/2009, 14:40
Software Components, Tools and Databases
oral
Geant4 is nowadays a mature Monte Carlo system; new functionality has been extensively added to the toolkit since its first public release in 1998. Nevertheless, its architectural design and software technology features have remained substantially unchanged since their original conception in the RD44 phase of the mid-1990s.
An R&D project has recently been launched at INFN to revisit Geant4...
Dr
Yushu Yao
(LBNL)
26/03/2009, 14:40
Software Components, Tools and Databases
oral
ATLAS software has been developed mostly on the CERN Linux cluster
lxplus [1] or on similar facilities at the experiment's Tier-1 centers. The
fast rise of virtualization technology has the potential to change this
model, turning every laptop or desktop into an ATLAS analysis platform. In
the context of the CernVM project[2] we are developing a suite of tools and
CernVM plug-in extensions to...
Dr
Mohammad Al-Turany
(GSI DARMSTADT)
26/03/2009, 15:00
Software Components, Tools and Databases
oral
FairRoot is the simulation and analysis framework used by the CBM and PANDA experiments at FAIR/GSI.
The use of GPU's for event reconstruction in FairRoot will be presented. The fact that CUDA (Nvidia's Compute Unified Device Architecture) development tools work alongside the conventional C/C++ compiler, makes it possible to mix GPU code with general-purpose code for the host CPU, based on...
Wolfgang Ehrenfeld
(DESY)
26/03/2009, 15:00
Software Components, Tools and Databases
oral
The ATLAS trigger system is responsible for selecting the interesting collision events delivered by the Large Hadron Collider (LHC). The ATLAS trigger will need to achieve a rejection factor of ~10^-7 against random proton-proton collisions while still efficiently selecting interesting events. After a first processing level based on hardware, the final event selection is based on custom...
Dr
Hans Wenzel
(Fermilab), Dr
Marian Zvada
(Fermilab)
26/03/2009, 15:20
Software Components, Tools and Databases
oral
We will present the monitoring system for the analysis farm of the CDF
experiment at the Tevatron (CAF). All monitoring data are collected in a
relational database (PostgreSQL), with SQL providing a common interface to the monitoring data.
These monitoring data are displayed by a web application in the form
of Java Server Pages served by the Apache Tomcat server.
For the database...
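The design point is that every display, web or otherwise, reads the same tables through plain SQL. A hedged sketch of that idea (SQLite here instead of PostgreSQL, and the schema is invented for illustration):

```python
import sqlite3

# Invented monitoring schema: one row per farm job, with its state.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (user TEXT, state TEXT)")
db.executemany("INSERT INTO jobs VALUES (?, ?)", [
    ("alice", "running"), ("alice", "held"), ("bob", "running"),
])
# A web page and a command-line tool can both issue the same query,
# so SQL is the common interface to the monitoring data.
counts = dict(db.execute(
    "SELECT state, COUNT(*) FROM jobs GROUP BY state ORDER BY state"))
```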
Dr
Brinick Simmons
(Department of Physics and Astronomy - University College London)
26/03/2009, 15:20
Software Components, Tools and Databases
oral
The ATLAS experiment's RunTimeTester (RTT) is a software testing
framework into which software package developers can plug their tests,
have them run automatically, and obtain feedback via email and the web.
The RTT processes the ATLAS nightly build releases, using acron to launch runs
on a dedicated cluster at CERN, and submitting user jobs to private LSF
batch queues. Running higher...
Dr
Stefan Roiser
(CERN)
26/03/2009, 15:40
Software Components, Tools and Databases
oral
The LCG Applications Area at CERN provides basic software components for the LHC experiments, such as ROOT, POOL and COOL, which are developed in-house, as well as a set of ~70 "external" software packages that are needed in addition, such as Python, Boost, Qt and CLHEP. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing,...
Axel Naumann
(CERN)
26/03/2009, 16:30
Software Components, Tools and Databases
oral
ROOT is planning to replace a large part of its C++ interpreter CINT. The new implementation will be based on the LLVM compiler infrastructure. LLVM is developed by, among others, Apple, Adobe and the University of Illinois at Urbana-Champaign; it is open source. Once available, LLVM will offer an ISO-compliant C++ parser, a bytecode generator and execution engine, a just-in-time compiler, and...
Fred Luehring
(Indiana University)
26/03/2009, 16:50
Software Components, Tools and Databases
oral
We update our CHEP06 presentation on the ATLAS experiment software
infrastructure used to build, validate, distribute, and document the ATLAS
offline software. The ATLAS collaboration's computational resources and
software developers are distributed around the globe in more than 30 countries.
The ATLAS offline code base is currently over 5 MSLOC in 10000+ C++
classes organized into about...
David Lange
(LLNL)
26/03/2009, 17:10
Software Components, Tools and Databases
oral
The offline software suite of the Compact Muon Solenoid (CMS) experiment must support the production and analysis activities across the distributed computing environment developed by the LHC experiments. This system relies on over 100 external software packages and includes the developments of hundreds of active developers. The applications of this software require consistent and rapid...
Mr
Dmitri Konstantinov
(IHEP Protvino)
26/03/2009, 17:30
Software Components, Tools and Databases
oral
The Generator Services project collaborates with the Monte Carlo
generator authors and with the LHC experiments in order to prepare
validated, LCG-compliant code for both the theoretical and the
experimental communities at the LHC. On the one hand it provides
technical support, as far as the installation and maintenance of
the generator packages on the supported platforms is...
Mr
Andrzej Nowak
(CERN)
26/03/2009, 17:50
Software Components, Tools and Databases
oral
At CHEP2007 we reported on the perfmon2 subsystem as a tool for interfacing to the PMUs (Performance Monitoring Units) which are found in the hardware of all modern processors (from AMD, Intel, SUN, IBM, MIPS, etc.).
The intent was always to get the subsystem into the Linux kernel by default. The talk will report on how progress is now being made (after long discussions) and also show the...
Robert Petkus
(Brookhaven National Laboratory)
26/03/2009, 18:10
Software Components, Tools and Databases
oral
Robust, centralized system and application logging services are vital to all computing organizations, regardless of size. For the past year, the RHIC/USATLAS Computing Facility (RACF) has dramatically augmented the utility of logging services with Splunk. Splunk is a powerful application that functions as a log search engine, providing fast, real-time access to data from servers,...