Conveners
Track 2 Session: #1 (Reconstruction)
- Elizabeth Sexton-Kennedy (Fermi National Accelerator Lab. (US))
Track 2 Session: #2 (Exp. Comp. Models)
- Ivan Kisel (Johann-Wolfgang-Goethe Univ. (DE))
Track 2 Session: #3 (Frameworks)
- Andrew Norman (Fermilab)
Track 2 Session: #4 (Simulation)
- Andrea Dell'Acqua (CERN)
Track 2 Session: #5 (Analysis)
- Andrew Norman (Fermilab)
Track 2 Session: #6 (Tools)
- Elizabeth Sexton-Kennedy (Fermi National Accelerator Lab. (US))
Description
Offline software
Andreas Salzburger
(CERN)
13/04/2015, 14:00
Track2: Offline software
oral presentation
Track reconstruction is one of the most complex elements of the reconstruction of events recorded by ATLAS from collisions delivered by the LHC. It is the most time consuming reconstruction component in high luminosity environments. After a hugely successful Run-1, the flat budget projections for computing resources for Run-2 of the LHC together with the demands of reconstructing higher...
David Lange
(Lawrence Livermore Nat. Laboratory (US))
13/04/2015, 14:15
Track2: Offline software
oral presentation
Over the past several years, the CMS experiment has made significant changes to its detector simulation and reconstruction applications motivated by the planned program of detector upgrades over the next decade. These upgrades include both completely new tracker and calorimetry systems and changes to essentially all major detector components to meet the requirements of very high pileup...
Marco Rovere
(CERN)
13/04/2015, 14:30
Track2: Offline software
oral presentation
The CMS tracking code is organized in several levels, known as 'iterative steps', each optimized to reconstruct a class of particle trajectories, such as those of particles originating from the primary vertex or displaced tracks from particles resulting from secondary vertices. Each iterative step consists of seeding, pattern recognition and fitting by a Kalman filter, and a final filtering and...
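The Kalman-filter fit step described in this abstract can be illustrated with a minimal one-dimensional sketch; the single-parameter state and all numbers are hypothetical stand-ins for the real five-parameter CMS track fit with material effects:

```python
# Minimal 1D Kalman filter: estimate a track parameter (e.g. a hit
# position) from noisy measurements. Illustrative only -- real track
# fits propagate full helix states between detector layers.

def kalman_update(state, variance, measurement, meas_variance):
    """One Kalman update step: blend the prediction with a measurement."""
    gain = variance / (variance + meas_variance)      # Kalman gain
    new_state = state + gain * (measurement - state)  # weighted mean
    new_variance = (1.0 - gain) * variance            # shrunk uncertainty
    return new_state, new_variance

# Start from a rough seed and refine it with three noisy hits.
state, var = 0.0, 100.0
for hit in [1.2, 0.8, 1.1]:
    state, var = kalman_update(state, var, hit, meas_variance=0.25)
```

Each update pulls the estimate toward the new hit in proportion to the gain, and the variance shrinks monotonically as hits are added.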
Barbara Storaci
(Universitaet Zuerich (CH))
13/04/2015, 14:45
Track2: Offline software
oral presentation
The LHCb track reconstruction uses sophisticated pattern recognition algorithms to reconstruct trajectories of charged particles. Their main feature is the use of a Hough-transform like approach to connect track segments from different subdetectors, allowing for having no tracking stations in the magnet of LHCb. While yielding a high efficiency, the track reconstruction is a major contributor...
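The Hough-transform-like voting idea can be sketched as follows; the straight-line parametrization, trial slopes, and binning are illustrative assumptions, not the actual LHCb pattern recognition:

```python
# Toy Hough-style voting: hits from a straight track vote for
# (slope, intercept) cells in parameter space; the most-voted cell
# is taken as the track candidate.
from collections import Counter

def hough_vote(hits, slopes):
    votes = Counter()
    for x, y in hits:
        for m in slopes:
            b = round(y - m * x, 1)      # intercept bin for this trial slope
            votes[(m, b)] += 1
    return votes

# Three hits on the line y = 2x, plus one noise hit.
hits = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (1.5, 0.3)]
slopes = [round(0.5 * k, 1) for k in range(1, 9)]   # trial slopes 0.5 .. 4.0
best, nvotes = max(hough_vote(hits, slopes).items(), key=lambda kv: kv[1])
```

The collinear hits pile up in a single cell while the noise hit scatters its votes, which is what makes the approach robust against background.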
Dominick Rocco
13/04/2015, 15:00
Track2: Offline software
oral presentation
The NOvA experiment is a long baseline neutrino oscillation experiment utilizing the NuMI beam generated at Fermilab. The experiment will measure the oscillations within a muon neutrino beam in a 300 ton Near Detector located underground at Fermilab and a functionally-identical 14 kiloton Far Detector placed 810 km away. The detectors are liquid scintillator tracking calorimeters with a...
Giuseppe Cerati
(Univ. of California San Diego (US))
13/04/2015, 15:15
Track2: Offline software
oral presentation
Power density constraints are limiting the performance improvements of modern CPUs. To address this we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of...
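As a toy illustration of distributing independent reconstruction work across workers, the sketch below farms out per-candidate "fits" with Python's thread pool; the least-squares slope and the candidate data are placeholders (real CPU-bound code would use processes or native threads to sidestep the GIL):

```python
# Distribute independent track-candidate fits across a worker pool.
# The "fit" is a placeholder least-squares slope through the origin,
# not a real tracking algorithm.
from concurrent.futures import ThreadPoolExecutor

def fit_slope(hits):
    """Least-squares slope through the origin for (x, y) hit pairs."""
    num = sum(x * y for x, y in hits)
    den = sum(x * x for x, y in hits)
    return num / den

candidates = [[(1.0, 2.0), (2.0, 4.0)],     # candidate on y = 2x
              [(1.0, -1.0), (2.0, -2.0)]]   # candidate on y = -x

with ThreadPoolExecutor(max_workers=2) as pool:
    slopes = list(pool.map(fit_slope, candidates))   # one task per candidate
```

Because the candidates share no state, the work parallelizes trivially; the hard part, as the abstract notes, is restructuring real algorithms so their units of work become independent like this.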
Oliver Frost
(DESY)
13/04/2015, 15:30
Track2: Offline software
oral presentation
With the upgraded electron-positron collider facility SuperKEKB and the Belle II experiment, the Japanese high-energy research center KEK strives to exceed its own world-record luminosity by a factor of 40.
To provide a solid base for the event reconstruction within the central drift chamber in the enhanced luminosity setup, a powerful track finding algorithm coping with the higher beam induced backgrounds...
Dr
Ivan Kisel
(Johann-Wolfgang-Goethe Univ. (DE))
13/04/2015, 15:45
Track2: Offline software
oral presentation
The future heavy-ion experiment CBM (FAIR/GSI, Darmstadt, Germany) will focus on the measurement of very rare probes at interaction rates up to 10 MHz with a data flow of up to 1 TB/s. The beam will be delivered as a free stream of particles without bunch structure. This requires full online event reconstruction and selection not only in space, but also in time, so-called 4D event building and...
Rolf Seuster
(TRIUMF (CA))
13/04/2015, 16:30
Track2: Offline software
oral presentation
The talk will give a summary of the broad spectrum of software upgrade projects to prepare ATLAS for the challenges of the upcoming LHC Run-2. Those projects include the reduction of the CPU time required for reconstruction by a factor of 3 compared to 2012, which was required to meet the challenges of the expected increase in pileup and the higher data-taking rate of up to 1 kHz. As well, the new...
Scott Snyder
(Brookhaven National Laboratory (US))
13/04/2015, 16:45
Track2: Offline software
oral presentation
During the 2013-2014 shutdown of the Large Hadron Collider, ATLAS switched to a new event data model for analysis, called the xAOD. A key feature of this model is the separation of the object data from the objects themselves (the 'auxiliary store'). Rather than being stored as member variables of the analysis classes, all object data are stored separately, as vectors of simple values. Thus, the...
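The separation of object data into vectors of simple values can be mimicked in a few lines; the class and attribute names below are illustrative, not the actual xAOD API:

```python
# Sketch of the "auxiliary store" idea: object attributes live in
# parallel vectors keyed by name, and a lightweight object is just an
# index into those vectors (structure-of-arrays rather than
# array-of-structures).

class AuxStore:
    def __init__(self):
        self.vectors = {}            # attribute name -> vector of values

    def add(self, **attrs):
        """Append one object's attributes; return its index."""
        for name, value in attrs.items():
            self.vectors.setdefault(name, []).append(value)
        return len(next(iter(self.vectors.values()))) - 1

    def get(self, index, name):
        return self.vectors[name][index]

store = AuxStore()
i = store.add(pt=41.5, eta=0.3)
j = store.add(pt=17.2, eta=-1.1)

# Whole columns can be processed without touching objects at all:
max_pt = max(store.vectors["pt"])
```

Storing each attribute contiguously is what makes column-wise reading, compression, and partial I/O cheap, which is the point of the design described in the abstract.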
Dr
Peter Van Gemmeren
(Argonne National Laboratory (US))
13/04/2015, 17:00
Track2: Offline software
oral presentation
ATLAS developed and employed for Run 1 of the Large Hadron Collider a sophisticated infrastructure for metadata handling in event processing jobs. This infrastructure profits from a rich feature set provided by the ATLAS execution control framework, including standardized interfaces and invocation mechanisms for tools and services, segregation of transient data stores with concomitant object...
Marco Rovere
(CERN)
13/04/2015, 17:15
Track2: Offline software
oral presentation
The Data Quality Monitoring (DQM) Software is a central tool in the CMS experiment. Its flexibility allows for integration in several key environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained data analysis and certification; Release-Validation, to constantly validate the functionalities and the performance of the reconstruction software; in Monte Carlo...
Dr
Carl Vuosalo
(University of Wisconsin (US))
13/04/2015, 17:30
Track2: Offline software
oral presentation
The CMS experiment has developed a new analysis object format (the "mini-AOD") targeted to be less than 10% of the size of the Run 1 AOD format. The motivation for the mini-AOD format is to have a small and quickly derived data format from which the majority of CMS analysis users can perform their analysis work. This format is targeted at having sufficient information to serve about 80% of CMS...
Janusz Martyniak
(Imperial College London)
13/04/2015, 17:45
Track2: Offline software
oral presentation
The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for offline batch simulation and reconstruction as well as online data quality checks. The software provides both traditional particle physics functionalities such as track reconstruction...
Adam Aurisano
(University of Cincinnati)
13/04/2015, 18:00
Track2: Offline software
oral presentation
The NOvA experiment is a two-detector, long-baseline neutrino experiment operating in the recently upgraded NuMI muon neutrino beam. Simulating neutrino interactions and backgrounds requires many steps including: the simulation of the neutrino beam flux using FLUKA and the FLUGG interface; cosmic ray generation using CRY; neutrino interaction modeling using GENIE; and a simulation of the...
Norman Anthony Graf
(SLAC National Accelerator Laboratory (US))
13/04/2015, 18:15
Track2: Offline software
oral presentation
The Heavy Photon Search (HPS) is an experiment at the Thomas Jefferson National Accelerator Facility (JLab) designed to search for a hidden sector photon (A′) in fixed target electroproduction. It uses a silicon microstrip tracking and vertexing detector inside a dipole magnet to measure charged particle trajectories and a fast electromagnetic calorimeter just downstream of the magnet to...
Vakho Tsulaia
(Lawrence Berkeley National Lab. (US))
14/04/2015, 14:00
Track2: Offline software
oral presentation
AthenaMP is a multi-process version of the ATLAS reconstruction and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows the sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated...
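The fork-and-copy-on-write pattern the abstract describes can be sketched on Unix as follows; the "geometry" data and the event slices are hypothetical placeholders for the real framework's shared state and event queues:

```python
# Toy illustration of the AthenaMP approach: initialize once in the
# parent, then fork workers that share the parent's memory pages
# copy-on-write. Unix-only sketch; the real framework adds event
# scheduling, output merging, etc.
import os

geometry = list(range(1_000_000))   # stand-in for large shared detector data

def process_events(events):
    # Reads of `geometry` touch shared pages; nothing is copied
    # until a worker writes to them.
    return sum(geometry[e] for e in events)

pids = []
for w in range(2):
    pid = os.fork()
    if pid == 0:                        # child: process its slice, then exit
        process_events(range(w, 10, 2))
        os._exit(0)
    pids.append(pid)

statuses = [os.waitpid(pid, 0)[1] for pid in pids]
```

Because the large structure is built before the fork, both workers read it at close to zero extra memory cost, which is the memory-footprint win the abstract refers to.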
James Catmore
(University of Oslo (NO))
14/04/2015, 14:15
Track2: Offline software
oral presentation
During the Long Shutdown of the LHC, the ATLAS collaboration overhauled its analysis model based on experience gained during Run 1. A significant component of the model is a "Derivation Framework" that takes the Petabyte-scale AOD output from ATLAS reconstruction and produces samples, typically Terabytes in size, targeted at specific analyses. The framework incorporates all of the...
Dr
Sami Kama
(Southern Methodist University (US))
14/04/2015, 14:30
Track2: Offline software
oral presentation
The challenge faced by HEP experiments from the current and expected architectural evolution of CPUs and co-processors is how to successfully exploit concurrency and keep memory consumption within reasonable limits. This is a major change from frameworks which were designed for serial event processing on single core processors in the 2000s. ATLAS has recently considered this problem in some...
Charles Leggett
(Lawrence Berkeley National Lab. (US))
14/04/2015, 14:45
Track2: Offline software
oral presentation
The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from the early 2000s, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in...
Dr
Christopher Jones
(Fermi National Accelerator Lab. (US))
14/04/2015, 15:00
Track2: Offline software
oral presentation
During 2014, the CMS Offline and Computing Organization completed the necessary changes to use the CMS threaded framework in the full production environment. Running reconstruction workflows using the multi-threaded framework is a crucial element of CMS' 2015 and beyond production plan. We will briefly discuss the design of the CMS Threaded Framework, in particular how the design affects...
Dr
Florian Uhlig
(GSI Darmstadt)
14/04/2015, 15:15
Track2: Offline software
oral presentation
The FairRoot framework is the standard framework for simulation, reconstruction and data analysis developed at GSI for the future experiments at the FAIR facility.
The framework delivers base functionality for simulation, e.g. infrastructure to easily implement a set of detectors, fields, and event generators. Moreover, the framework decouples the user code (e.g. geometry description,...
Dr
Mohammad Al-Turany
(CERN)
14/04/2015, 15:30
Track2: Offline software
oral presentation
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments, and it is now being extended beyond FAIR to experiments at other facilities. The ALFA framework is a joint...
Marko Staric
(J. Stefan Institute, Ljubljana, Slovenia)
14/04/2015, 15:45
Track2: Offline software
oral presentation
We present the software framework being developed for physics analyses using the data collected by the Belle II experiment. The analysis workflow is organized in a modular way, integrated within the Belle II software framework (BASF2). A set of physics analysis modules that perform simple and well-defined tasks and are common to almost all physics analyses are provided. The physics modules do...
Dr
Makoto Asai
(SLAC National Accelerator Laboratory (US))
14/04/2015, 16:30
Track2: Offline software
oral presentation
The Geant4 Collaboration released a new generation of the Geant4 simulation toolkit (version 10.0) in December 2013, and continues to improve its physics, computing performance and usability. This presentation will cover the major improvements made since version 10.0. The physics evolutions include improvement of the Fritiof hadronic model, extension of the INCL++ model to higher...
Ivana Hrivnacova
(IPNO, Universitรฉ Paris-Sud, CNRS/IN2P3)
14/04/2015, 16:45
Track2: Offline software
oral presentation
Virtual Monte Carlo (VMC) provides an abstract interface into Monte Carlo transport codes. A user VMC based application, independent from the specific Monte Carlo codes, can be then run with any of the supported simulation programs. Developed by the ALICE Offline Project and further included in ROOT, the interface and implementations have reached stability during the last decade and have...
Norman Anthony Graf
(SLAC National Accelerator Laboratory (US))
14/04/2015, 17:00
Track2: Offline software
oral presentation
As the complexity and resolution of particle detectors increases, the need for detailed simulation of the experimental setup also increases. We have developed efficient and flexible tools for detailed physics and detector response simulations which build on the power of the Geant4 toolkit but free the end user from any C++ coding. Geant4 is the de facto high-energy physics standard for...
Mr
Federico Carminati
(CERN)
14/04/2015, 17:15
Track2: Offline software
oral presentation
Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements....
David Lange
(Lawrence Livermore Nat. Laboratory (US))
14/04/2015, 17:30
Track2: Offline software
oral presentation
This presentation will discuss new features of the CMS simulation for Run 2, where we have made considerable improvements during LHC shutdown to deal with the increased event complexity and rate for Run 2. For physics improvements migration from Geant4 9.4p03 to Geant4 10.0p02 has been performed. CPU performance was improved by introduction of the Russian roulette method inside CMS...
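The Russian roulette method mentioned here is a standard variance-reduction trick: randomly kill low-weight particles and boost the weight of survivors so that the expectation is unchanged. A minimal sketch, with illustrative thresholds that are not CMS's actual settings:

```python
# Russian roulette variance reduction: particles below a weight
# threshold are killed with some probability, and survivors are
# re-weighted so the expected total weight is preserved.
import random

def russian_roulette(weights, threshold=0.1, survival_prob=0.5, rng=random):
    survivors = []
    for w in weights:
        if w >= threshold:
            survivors.append(w)                   # important: always keep
        elif rng.random() < survival_prob:
            survivors.append(w / survival_prob)   # keep, with boosted weight
        # else: killed -- no further tracking cost for this particle
    return survivors

rng = random.Random(42)
weights = [0.01] * 10000 + [1.0] * 10   # many soft particles, a few hard ones
kept = russian_roulette(weights, rng=rng)
```

Roughly half of the soft particles are dropped (saving their tracking time), yet the summed weight stays close to the original 110 on average, so physics observables are unbiased.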
Gaelle Boudoul
(Universite Claude Bernard-Lyon I (FR))
14/04/2015, 17:45
Track2: Offline software
oral presentation
CMS Detector Description (DD) is an integral part of the CMSSW software multithreaded framework. CMS software has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the limitations of the Run I DD model and changes implemented for the restart of the LHC program in 2015....
Dr
Tobias Stockmanns
(FZ Jรผlich GmbH)
14/04/2015, 18:00
Track2: Offline software
oral presentation
Future particle physics experiments are searching more and more for rare decays which have signatures in the detector similar to the huge background. For such events, simple selection criteria usually do not exist, which makes it impossible to implement a hardware trigger based on a small subset of detector data.
Therefore all the detector data is read out continuously and processed...
Sameh Mannai
(Universite Catholique de Louvain (UCL) (BE))
14/04/2015, 18:15
Track2: Offline software
oral presentation
The Semi-Digital Hadronic CALorimeter (SDHCAL) using Glass Resistive Plate Chambers (GRPCs) is one of the two hadronic calorimeter options proposed by the ILD (International Large Detector) project for the future International Linear Collider (ILC) experiments. It is a sampling calorimeter with 48 layers. Each layer has a size of 1 m² and is finely segmented into cells of 1 cm², ensuring a...
Glen Cowan
(Royal Holloway, University of London)
16/04/2015, 09:00
Track2: Offline software
oral presentation
High Energy Physics has been using Machine Learning techniques (commonly known as Multivariate Analysis) since the 1990s, first with Artificial Neural Networks and more recently with Boosted Decision Trees, Random Forests, etc. Meanwhile, Machine Learning has become a full-blown field of computer science. With the emergence of Big Data, data scientists are developing new Machine Learning algorithms to...
Gloria Corti
(CERN)
16/04/2015, 09:15
Track2: Offline software
oral presentation
In the LHCb experiment all massive processing of data is handled centrally. In the case of simulated data a wide variety of different types of Monte Carlo (MC) events has to be produced, as each physics analysis needs different sets of signal and background events. In order to cope with this large set of different types of MC events, of the order of several hundreds, a numerical event type...
Dominick Rocco
16/04/2015, 09:30
Track2: Offline software
oral presentation
In this paper we present the Library Event Matching (LEM) classification technique for particle identification. The LEM technique was developed for the NOvA electron neutrino appearance analysis as an alternative but complementary approach to standard multivariate methods. Traditional multivariate PIDs are based on high-level reconstructed quantities which can obscure or discard important...
Stefan Gadatsch
(NIKHEF (NL))
16/04/2015, 09:45
Track2: Offline software
oral presentation
In particle physics experiments data analyses generally use Monte Carlo (MC) simulation templates to interpret the observed data. These simulated samples may depend on one or multiple model parameters, such as a shifting mass parameter, and a set of such samples may be required to scan over the various parameter values. Since detailed detector MC simulation can be time-consuming, there is...
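One common answer to this problem is to morph between templates simulated at neighbouring parameter values instead of generating new MC samples. The sketch below shows the simplest variant (linear bin-by-bin interpolation); the mass points and bin contents are hypothetical:

```python
# Minimal vertical template morphing: given histograms simulated at
# two parameter values, interpolate bin-by-bin to an intermediate
# value instead of running new detector simulation.

def morph(template_lo, template_hi, x_lo, x_hi, x):
    """Linear bin-by-bin interpolation between two templates."""
    f = (x - x_lo) / (x_hi - x_lo)           # fractional distance in x
    return [(1.0 - f) * lo + f * hi
            for lo, hi in zip(template_lo, template_hi)]

# Templates simulated at mass = 120 and 130 (hypothetical bin contents):
t120 = [10.0, 50.0, 10.0]
t130 = [ 4.0, 20.0, 40.0]
t125 = morph(t120, t130, 120.0, 130.0, 125.0)
```

Production tools use more sophisticated (e.g. moment or horizontal) morphing, but the idea is the same: a continuous template as a function of the model parameter at a fraction of the simulation cost.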
Alessandro Manzotti
(The University of Chicago)
16/04/2015, 10:00
Track2: Offline software
oral presentation
CosmoSIS [http://arxiv.org/abs/1409.3409] is a modular system for cosmological parameter estimation, based on Markov Chain Monte Carlo (MCMC) and related techniques. It provides a series of samplers, which drive the exploration of the parameter space, and a series of modules, which calculate the likelihood of the observed data for a given physical model, determined by the location of a...
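The sampler/likelihood-module split can be illustrated with a minimal Metropolis sampler; the one-parameter Gaussian log-likelihood stands in for a real cosmology module and is purely illustrative:

```python
# Minimal Metropolis MCMC: propose a random step, accept it with
# probability min(1, L'/L). The sampler only ever calls log_like(),
# mirroring the sampler/module separation described above.
import math
import random

def log_like(theta):
    return -0.5 * (theta - 1.0) ** 2    # hypothetical: peak at theta = 1

def metropolis(n_steps, step=0.5, seed=1):
    rng = random.Random(seed)
    theta, ll = 0.0, log_like(0.0)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = log_like(prop)
        if math.log(rng.random()) < ll_prop - ll:   # accept/reject
            theta, ll = prop, ll_prop
        chain.append(theta)              # rejected steps repeat old theta
    return chain

chain = metropolis(20000)
mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
```

After burn-in the chain samples the posterior, so its mean converges toward the likelihood peak at 1; swapping in a different `log_like` changes the physics without touching the sampler.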
Dr
Frederik Beaujean
(LMU Munich)
16/04/2015, 10:15
Track2: Offline software
oral presentation
The Bayesian analysis toolkit (BAT) is a C++ package centered around Markov-chain Monte Carlo sampling. It is used in analyses of various particle-physics experiments such as ATLAS and GERDA. The software has matured over the last few years to a version 1.0. We will summarize the lessons learned and report on the current developments of a complete redesign...
Lorenzo Moneta
(CERN)
16/04/2015, 11:00
Track2: Offline software
oral presentation
ROOT is a C++ data analysis framework, providing advanced statistical methods needed by the HEP experiments for analysing their data. R is a free software framework for statistical computing, which complements the functionality of ROOT, by including some of the latest tools developed by statistics and computing research groups. We will present the ROOT-R package, a module in ROOT, which...
Geert Jan Besjes
(Radboud University Nijmegen (NL))
16/04/2015, 11:15
Track2: Offline software
oral presentation
We present a software framework for statistical data analysis, called *HistFitter*, that has been used extensively in the ATLAS Collaboration to analyze data of proton-proton collisions produced by the Large Hadron Collider at CERN. Most notably, HistFitter has become a de facto standard in searches for supersymmetric particles since 2012, with some usage for Exotic and Higgs boson physics....
Dr
Lisa Gerhardt
(LBNL)
16/04/2015, 11:30
Track2: Offline software
oral presentation
SciDB is an open-source analytical database for scalable complex analytics on very large array or multi-structured data from a variety of sources, programmable from Python and R. It runs on HPC, commodity hardware grids, or in a cloud and can manage and analyze terabytes of array-structured data and do complex analytics in-database.
We present an overall description of the SciDB framework...
Markus Frank
(CERN)
16/04/2015, 11:45
Track2: Offline software
oral presentation
The detector description is an essential component that has to be used to analyse and simulate data resulting from particle collisions in high energy physics experiments. Based on the DD4hep detector description toolkit, a flexible and data-driven simulation framework was designed using the Geant4 toolkit. We present this framework and describe the guiding requirements and the...
Ms
Xiaofeng LEI
(Institute of High Energy Physics, University of Chinese Academy of Sciences)
16/04/2015, 12:00
Track2: Offline software
oral presentation
In the past years, we have successfully applied Hadoop to high-energy physics analysis. Although we have improved the efficiency of data analysis and reduced the cost of cluster building, there is still room for optimization, such as static pre-selection, low-efficiency random data reading, and the I/O bottleneck caused by FUSE, which is used to access HDFS. In order to...
Robert Kutschke
(Fermilab)
16/04/2015, 12:15
Track2: Offline software
oral presentation
The art event processing framework is used by almost all new experiments at Fermilab, and by several outside of Fermilab. All use art as an external product in the same sense that the compiler, ROOT, Geant4, CLHEP and boost are external products. The art team has embarked on a campaign to document art and develop training materials for new users. Many new users of art have little or no...