- Dr Giulio Palombo (University of Milan - Bicocca), 05/11/2008 14:00, 2. Data Analysis, Parallel Talk
  Datasets in modern High Energy Physics (HEP) experiments are often described by dozens or even hundreds of input variables (features). Reducing a full feature set to a subset that most completely represents the information in the data is therefore an important task in the analysis of HEP data. We compare various feature selection algorithms for supervised learning using several datasets such as...
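As background, the simplest family of feature selection methods compared in such studies are filter methods, which score each feature independently against the class label. The sketch below ranks features by absolute Pearson correlation with a binary label; the criterion and function names are illustrative assumptions, not the specific algorithms of the talk.

```python
# Minimal filter-style feature ranking sketch (illustrative only):
# score each feature by |Pearson correlation| with the class label,
# then return feature indices from most to least correlated.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy) if vx and vy else 0.0

def rank_features(events, labels):
    """events: list of feature vectors; labels: 0/1 class labels.
    Returns feature indices sorted by decreasing |correlation|."""
    n_features = len(events[0])
    scores = [abs(pearson([e[i] for e in events], labels))
              for i in range(n_features)]
    return sorted(range(n_features), key=lambda i: -scores[i])
```

A wrapper (multivariate) method would instead score feature subsets by the performance of the downstream classifier, at much higher cost.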
- Dr Marcin Wolter (Henryk Niewodniczanski Institute of Nuclear Physics PAN), 05/11/2008 14:25, 2. Data Analysis, Parallel Talk
  Tau leptons will play an important role in the physics program at the LHC. They will be used not only in electroweak measurements and in detector-related studies such as the determination of the E_T^miss scale, but also in searches for new phenomena such as the Higgs boson or Supersymmetry. Due to the overwhelming background from QCD processes, highly efficient algorithms are essential to...
- Dr Fabrizio Furano (Conseil Europeen Recherche Nucl. (CERN)), 05/11/2008 14:50, 2. Data Analysis, Parallel Talk
  In this talk we address the way the ALICE Offline Computing is starting to exploit the possibilities offered by the Scalla/Xrootd repository globalization tools. These tools are quite general and can be adapted to many situations without disrupting existing designs, adding a level of coordination among xrootd-based storage clusters and the ability for them to interact.
- Dr Jerzy Nogiec (Fermi National Accelerator Laboratory), 05/11/2008 15:15, 2. Data Analysis, Parallel Talk
  Accelerator R&D environments produce data characterized by different levels of organization. Whereas some systems produce repetitively predictable and standardized structured data, others may produce data of unknown or changing structure. In addition, structured data, typically sets of numeric values, are frequently logically connected with unstructured content (e.g., images, graphs,...
- Dr Alfio Lazzaro (Universita' degli Studi and INFN, Milano), 05/11/2008 16:10, 2. Data Analysis, Parallel Talk
  MINUIT is the most commonly used package in high energy physics for numerical minimization of multi-dimensional functions. The main algorithm of this package, MIGRAD, searches for the minimum using the function gradient. At each minimization iteration, MIGRAD requires the calculation of the first derivative with respect to each parameter of the function being minimized. In this presentation we will...
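The per-parameter derivative step described above can be sketched as a central finite difference. This is an illustrative stand-in, not MINUIT's actual implementation, and the function name is assumed; the point is that each parameter's derivative is computed independently, which makes this step a natural candidate for parallelization.

```python
# Illustrative sketch of the per-iteration gradient step: one central
# finite difference per free parameter. Each loop iteration is
# independent of the others, so they could run in parallel.

def numerical_gradient(f, params, step=1e-6):
    """Central-difference gradient of f at the point `params`."""
    grad = []
    for i in range(len(params)):
        up = list(params)
        up[i] += step
        down = list(params)
        down[i] -= step
        grad.append((f(up) - f(down)) / (2.0 * step))
    return grad
```

For a function of n parameters, one gradient evaluation costs 2n calls to f, which is why distributing the derivative calculations pays off when f itself is expensive (e.g., a likelihood over a large dataset).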
- Dr Michela Biglietti (University of Napoli and INFN), 05/11/2008 16:35, 2. Data Analysis, Parallel Talk
  The ATLAS trigger system is designed to select rare physics processes of interest from an extremely high rate of proton-proton collisions, reducing the LHC incoming rate by a factor of about 10^7. The short LHC bunch crossing period of 25 ns and the large background of soft-scattering events overlapping in each bunch crossing pose serious challenges, to both hardware and software, that the ATLAS trigger...
- Mr Danilo Enoque Ferreira De Lima (Federal University of Rio de Janeiro (UFRJ) - COPPE/Poli), 05/11/2008 17:00, 2. Data Analysis, Parallel Talk
  The ATLAS trigger system is responsible for selecting the interesting collision events delivered by the Large Hadron Collider (LHC). The ATLAS trigger will need to achieve a rejection factor of about 10^7 against random proton-proton collisions while still efficiently selecting interesting events. After a first processing level based on FPGAs and ASICs, the final event selection is based on...
- Dr Markward Britsch (Max-Planck-Institut fuer Kernphysik (MPI)), 05/11/2008 17:25, 2. Data Analysis, Parallel Talk
  A large hadron machine like the LHC, with its high track multiplicities, calls for powerful tools that drastically reduce the large background while selecting signal events efficiently. Such tools are widely needed and used in all areas of particle physics. Given the huge amount of data that will be produced at the LHC, the process of training as well as the process of...
- Liliana Teodorescu (Brunel University), 05/11/2008 17:50, 2. Data Analysis, Parallel Talk
  To address the data analysis challenges posed by the complexity of the data generated by current and future particle physics experiments, new techniques for performing various analysis tasks need to be investigated. In 2006 we introduced to the particle physics field one such technique, based on Gene Expression Programming (GEP), and successfully applied it to an event...
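As background on the technique named above: GEP encodes a candidate expression as a fixed-length linear gene (a K-expression) that is decoded breadth-first into an expression tree and then evaluated on the event variables. The sketch below shows only that decode-and-evaluate step with an assumed four-operator symbol set; it is not the authors' implementation.

```python
# Illustrative GEP building block: decode a K-expression (Karva
# notation) breadth-first into a tree and evaluate it. The symbol
# set {+, -, *, /} and terminal names are assumptions for the sketch.

ARITY = {'+': 2, '-': 2, '*': 2, '/': 2}

def eval_kexpression(gene, variables):
    """gene: string of function/terminal symbols, e.g. "+*aab";
    variables: dict mapping terminal names to numeric values."""
    # Build the tree level by level: each function node takes its
    # children from the next unread symbols of the gene, in order.
    nodes = [[gene[0], []]]          # each node: [symbol, child indices]
    frontier = [0]                   # nodes still awaiting children
    pos = 1
    while frontier:
        next_frontier = []
        for idx in frontier:
            for _ in range(ARITY.get(nodes[idx][0], 0)):
                nodes.append([gene[pos], []])
                nodes[idx][1].append(len(nodes) - 1)
                next_frontier.append(len(nodes) - 1)
                pos += 1
        frontier = [i for i in next_frontier
                    if ARITY.get(nodes[i][0], 0) > 0]

    def value(idx):
        sym, children = nodes[idx]
        if not children:
            return variables[sym]    # terminal: look up the variable
        left, right = value(children[0]), value(children[1])
        return {'+': left + right, '-': left - right,
                '*': left * right, '/': left / right}[sym]

    return value(0)
```

For example, the gene "+*aab" decodes to (a*b)+a: the root '+' takes '*' and 'a' as children, and '*' takes the remaining 'a' and 'b'. A GEP run evolves a population of such genes with genetic operators, selecting on a fitness measure such as classification accuracy.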