Session 3
Conveners:
- Rudiger Haake (CERN)
- Lorenzo Moneta (CERN)
Lorenzo Moneta (CERN), Markus Stoye (CERN), Paul Seyfert (CERN), Rudiger Haake (CERN), Steven Randolph Schramm (Universite de Geneve (CH)) - 10/04/2018, 09:00
David Josef Schmidt (Rheinisch Westfaelische Tech. Hoch. (DE)), Thorben Quast (Rheinisch Westfaelische Tech. Hoch. (DE)), Jonas Glombitza (Rheinisch-Westfaelische Tech. Hoch. (DE)) - 10/04/2018, 09:05
This is a merger of three individual contributions:
- https://indico.cern.ch/event/668017/contributions/2947026/
- https://indico.cern.ch/event/668017/contributions/2947027/
- https://indico.cern.ch/event/668017/contributions/2947028/
Kamil Rafal Deja (Warsaw University of Technology (PL)) - 10/04/2018, 10:00
Simulating detector response for Monte Carlo-generated collisions is a key component of every high-energy physics experiment. The methods currently used for this purpose provide high-fidelity results, but their precision comes at the price of high computational cost. In this work, we present a proof-of-concept solution for simulating the responses of detector clusters to particle...
Michela Paganini (Yale University (US)) - 10/04/2018, 11:00
In this contribution, we present a method for tuning perturbative parameters in Monte Carlo simulation using a classifier loss in high dimensions. We use an LSTM trained on the radiation pattern inside jets to learn the parameters of the final state shower in the Pythia Monte Carlo generator. This represents a step forward compared to unidimensional distributional template-matching methods.
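The classifier-loss tuning idea above can be illustrated with a toy sketch (not the authors' LSTM/Pythia setup): a hypothetical one-parameter "generator" is scanned, and for each candidate parameter a classifier is trained to separate generated events from reference data. The parameter at which the classifier is closest to chance level is the best tune. All names and the Gaussian stand-in distribution here are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simulate(theta, n=2000):
    # Toy stand-in for a tunable MC generator: one free parameter
    # shifts the distribution of a single observable.
    return rng.normal(loc=theta, scale=1.0, size=(n, 1))

# "Real" data, produced with an unknown true parameter (1.5 here).
data = simulate(1.5)

def separability(theta):
    # Train a classifier to distinguish data from simulation; its
    # cross-validated accuracy is ~0.5 when the two are indistinguishable.
    sim = simulate(theta)
    X = np.vstack([data, sim])
    y = np.concatenate([np.ones(len(data)), np.zeros(len(sim))])
    return cross_val_score(LogisticRegression(), X, y, cv=3).mean()

# The best tune minimizes how well the classifier separates the samples.
grid = np.linspace(0.0, 3.0, 13)
best = min(grid, key=separability)
```

In high dimensions the grid scan would be replaced by gradient-based or Bayesian optimization, and the logistic classifier by a sequence model such as the LSTM mentioned in the abstract; the principle of using classifier separability as the tuning objective is the same.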
Egor Zakharov - 10/04/2018, 11:25
Fast calorimeter simulation in LHCb
In HEP experiments, the CPU resources required by MC simulations are constantly growing and have become a very large fraction of the total computing power (greater than 75%). At the same time, the pace of performance improvements from technology is slowing down, so the only solution is a more efficient use of resources. Efforts are ongoing in the LHC experiments to...
Gul Rukh Khattak (University of Peshawar (PK)) - 10/04/2018, 11:50
Machine Learning techniques have been used in different applications by the HEP community; in this talk, we discuss the case of detector simulation. The number of simulated events expected to be needed by the LHC experiments and their High Luminosity upgrades is increasing dramatically and requires new fast simulation solutions. We will describe an R&D activity aimed at providing a...
Thorben Quast (Rheinisch Westfaelische Tech. Hoch. (DE))
The increased instantaneous luminosity at the HL-LHC will raise the computing requirements for event reconstruction and analysis at the current LHC-based experiments, limiting the resources available for simulating particles traversing matter. Performance improvements to state-of-the-art simulation frameworks such as Geant4 are ongoing but are unlikely to fully compensate for...
David Josef Schmidt (Rheinisch Westfaelische Tech. Hoch. (DE))
Developing and building an analysis in high-energy particle physics requires a large number of simulated events. Simulations at the LHC are usually complex and computationally intensive due to sophisticated detector architectures. In this context, Generative Adversarial Networks (GANs) have recently attracted wide interest. GANs can learn to generate complex data distributions and produce...
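The adversarial setup behind GAN-based simulation can be sketched in a deliberately minimal form, assuming a one-dimensional Gaussian observable and linear models in place of the deep networks a real application would use: a generator shifts latent noise toward the data, while a discriminator is trained to tell the two samples apart. All distributions and learning rates here are illustrative assumptions, not the contribution's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# "Real" events: a 1D Gaussian standing in for a detector observable.
def real_batch(n):
    return rng.normal(loc=2.0, scale=1.0, size=n)

# Generator: shifts latent noise, g(z) = z + b (b is its only weight).
b = 0.0
# Discriminator: logistic model D(x) = sigmoid(w*x + c).
w, c = 0.0, 0.0
lr_g, lr_d = 0.01, 0.1

for step in range(5000):
    z = rng.normal(size=256)
    fake = z + b
    real = real_batch(256)

    # Discriminator update: ascend log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w += lr_d * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr_d * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator update: ascend the non-saturating objective log D(fake).
    d_fake = sigmoid(w * fake + c)
    b += lr_g * np.mean((1 - d_fake) * w)

# After training, generated samples are centered near the data mean.
```

A real detector-simulation GAN replaces both linear models with deep networks over high-dimensional shower images, but the alternating two-player update shown here is the core of the method.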
Mr Jonas Glombitza
Machine learning models, especially deep neural networks, produce reliable predictions when applied to a test set similar to the training set. In physics research, machine learning models are usually designed to be applied to data but are trained on simulations. Differences between simulation and data can therefore cause substantial uncertainties in the application.
Here we attempt to...
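The simulation-to-data problem described above can be made concrete with a small sketch, assuming a single observable and a hypothetical systematic shift between simulation and data: a classifier trained on simulated events loses accuracy when evaluated on "data" whose feature distribution is shifted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def sample(n, shift=0.0):
    # Two classes separated in one observable; `shift` mimics a
    # systematic simulation/data difference in that observable.
    y = rng.integers(0, 2, size=n)
    x = rng.normal(loc=y + shift, scale=0.5, size=n).reshape(-1, 1)
    return x, y

# Train on simulation, then evaluate on both simulation and "data".
X_sim, y_sim = sample(4000)
clf = LogisticRegression().fit(X_sim, y_sim)

acc_sim = clf.score(*sample(4000))              # matched conditions
acc_data = clf.score(*sample(4000, shift=0.7))  # shifted "data"
```

Quantifying and reducing this accuracy gap, for instance with refinement or domain-adaptation techniques, is the kind of uncertainty the abstract refers to.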