Conveners
Detector Simulation
- Claudius Krause (HEPHY Vienna (ÖAW))
- Sascha Diefenbacher (Lawrence Berkeley National Lab. (US))
- Timo Janssen
Monte Carlo (MC) simulations are crucial for collider experiments, enabling the comparison of experimental data with theoretical predictions. However, these simulations are computationally demanding, and future requirements, such as increased event rates, are expected to outstrip available computational resources. Generative modeling can substantially cut computing costs by augmenting MC...
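As a rough illustration of the surrogate idea, the sketch below fits a trivial per-cell Gaussian model to toy "full MC" showers and then samples from it cheaply. Everything here (the 10-cell strip, the gamma-like profile, the Gaussian surrogate) is a hypothetical stand-in for a real generative network such as a GAN, VAE, or diffusion model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an expensive full MC: each "event" deposits energy in a
# 10-cell calorimeter strip with a noisy longitudinal profile.
def full_mc_shower(n_events, n_cells=10):
    depth = np.arange(n_cells)
    profile = depth**2 * np.exp(-depth / 2.0)          # gamma-like shape
    profile /= profile.sum()
    return rng.poisson(100 * profile, size=(n_events, n_cells))

# "Training": the surrogate here is just per-cell means and widths learned
# from a small full-MC sample (a deliberate oversimplification).
train = full_mc_shower(1000)
mu, sigma = train.mean(axis=0), train.std(axis=0)

def surrogate_shower(n_events):
    # Sampling from the fitted model is far cheaper than rerunning full MC.
    return np.clip(rng.normal(mu, sigma, size=(n_events, len(mu))), 0, None)

fast = surrogate_shower(100000)
# Largest per-cell discrepancy between surrogate and full-MC mean profile:
print(np.abs(fast.mean(axis=0) - mu).max())
```

The point is only the cost structure: the expensive step (Poisson shower generation, standing in for Geant4) runs once on a small sample, after which arbitrarily many events are drawn from the learned model.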
One potential roadblock for the HL-LHC programme, scheduled to begin in 2029, is the computational demand of traditional collision simulations. Projections suggest current methods will require millions of CPU-years annually, far exceeding existing computational capabilities. Replacing the calorimeter shower simulation with quantum-assisted deep learning surrogates can help bridge...
As data sets grow in size and complexity, simulated data play an increasingly important role in analysis. In many fields, two or more distinct simulation software applications are developed that trade accuracy against speed. The quality of insights extracted from the data stands to increase if the accuracy of faster, more economical simulation could be improved to...
The CMS Fast Simulation chain (FastSim) is roughly 10 times faster than the application based on the GEANT4 detector simulation and full reconstruction, referred to as FullSim. This advantage, however, comes at the price of decreased accuracy in some of the final analysis observables. A machine learning-based technique to refine those observables has been developed and its status is presented...
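A minimal sketch of what "refining" an observable means, assuming a toy FastSim/FullSim pair related by a simple bias and mis-scaling. The least-squares map below is a hypothetical stand-in for the neural-network regression used in the actual CMS work; none of the numbers are taken from CMS software.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: a "FullSim" observable (e.g. a jet response) and a "FastSim"
# version that is biased and slightly mis-scaled.
truth = rng.normal(1.0, 0.1, size=5000)                          # FullSim
fastsim = 0.9 * truth + 0.05 + rng.normal(0, 0.02, size=truth.size)

# Refinement here is a linear map (slope + intercept) fitted by least
# squares; a neural network generalizes this to nonlinear, multivariate
# corrections, but reduces to it in the linear limit.
A = np.vstack([fastsim, np.ones_like(fastsim)]).T
coef, *_ = np.linalg.lstsq(A, truth, rcond=None)
refined = A @ coef

# With an intercept term, ordinary least squares removes the mean bias.
print(abs(refined.mean() - truth.mean()))
```

The refined events retain FastSim's speed advantage: the correction is a cheap post-processing step applied to already-generated observables, not a rerun of the simulation.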
Fast event and detector simulation in high-energy physics using generative models provides a viable solution for generating sufficient statistics within a constrained computational budget, particularly in preparation for the High Luminosity LHC. However, many of these applications suffer from a quality/speed tradeoff. Diffusion models offer some of the best sampling quality but slow generation...
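To make the quality/speed tension concrete, the sketch below implements only the forward (noising) half of a DDPM-style diffusion model on toy one-dimensional "energy deposit" data; the schedule values are illustrative, not from any cited model. Generation must reverse this chain step by step, with one network evaluation per step, which is why plain diffusion sampling is slow relative to single-shot generators.

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear variance schedule, DDPM-style (illustrative values).
T = 1000
beta = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - beta)

x0 = rng.gamma(shape=3.0, scale=2.0, size=50000)   # toy energy deposits
x0 = (x0 - x0.mean()) / x0.std()                   # standardize

def noised(x0, t):
    # q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I)
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1 - alpha_bar[t]) * eps

xT = noised(x0, T - 1)
# After the full schedule the data are statistically indistinguishable from
# unit Gaussian noise; sampling reverses this chain over up to T steps.
print(float(xT.mean()), float(xT.std()))
```

Techniques such as fewer-step samplers or distillation shorten the reverse chain, trading some sample quality for speed, which is precisely the tradeoff the abstract refers to.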
Simulating particle physics data is an essential yet computationally intensive process in analyzing data from the LHC. Traditional fast simulation techniques often use a surrogate calorimeter model followed by a reconstruction algorithm to produce reconstructed objects. In this work, we introduce Particle-flow Neural Assisted Simulations (Parnassus), a deep learning-based method for generating...
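The contrast between a two-stage surrogate-plus-reconstruction pipeline and a single learned truth-to-reconstructed mapping can be caricatured as follows. All distributions, detector responses, and calibration factors here are invented for illustration and are not taken from Parnassus.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy contrast between the two pipelines mentioned in the abstract:
#  (a) traditional fast-sim: truth -> calorimeter surrogate -> reconstruction
#  (b) direct: truth -> reconstructed objects in one learned step
truth_pt = rng.exponential(30.0, size=10000) + 20.0      # particle pT (GeV)

# (a) two stages: a smeared detector response, then a reco calibration
def surrogate_response(pt):
    return pt * rng.normal(0.95, 0.08, size=pt.shape)    # deposited energy

def reconstruct(e):
    return e / 0.95                                      # calibrate back

reco_two_stage = reconstruct(surrogate_response(truth_pt))

# (b) one stage: a single conditional sampling step that emulates the
# net effect of (a) directly at the reconstructed-object level
reco_direct = truth_pt * rng.normal(1.0, 0.08 / 0.95, size=truth_pt.shape)

# Both pipelines target the same reconstructed-pT distribution.
print(reco_two_stage.mean(), reco_direct.mean())
```

Collapsing the two stages into one removes the intermediate calorimeter representation entirely, which is where the speed gain of direct truth-to-reco generation comes from.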
Simulating showers of particles in highly granular detectors is a key frontier in the application of machine learning to particle physics. Achieving high accuracy and speed with generative machine learning models can enable them to augment traditional simulations and alleviate a major computing constraint.
Recent developments have shown how diffusion-based generative shower simulation...