Conveners
Plenary: I
- Maria Girone (CERN)
- Lucia Silvestris (Universita e INFN, Bari (IT))
Plenary: II
- Jerome LAURET (Brookhaven National Laboratory)
- Maria Girone (CERN)
Plenary
- Lucia Silvestris (Universita e INFN, Bari (IT))
- David Britton (University of Glasgow (GB))
Plenary
- Domenico Elia (INFN Bari)
- Michael Poat
Plenary
- Monique Werlen (EPFL - Ecole Polytechnique Federale Lausanne (CH))
- Leonardo Cosmai
Plenary
- Fons Rademakers (CERN)
- Lucia Silvestris (Universita e INFN, Bari (IT))
Plenary
- Daniel Maitre (University of Durham (GB))
- Domenico Elia (INFN Bari)
Plenary
- Sophie Berkman
- Domenico Elia (INFN Bari)
Plenary
- Lucia Silvestris (Universita e INFN, Bari (IT))
- Axel Naumann (CERN)
Transport phenomena remain among the most challenging unsolved problems in computational physics, owing to the inherent nature of the Navier-Stokes equations. As a revolutionary technology, quantum computing opens a grand new perspective for numerical simulation, for instance in computational fluid dynamics (CFD). In this plenary talk, starting with an overview of quantum computing including...
The talk provides a short overview of quantum technology (QT) history leading up to the present. Let's take a hard look at where we stand with QT and what major pitfalls to expect. The presentation will focus particularly on the issue of the growing talent gap.
Simulation in High Energy Physics (HEP) places a heavy burden on the available computing resources and is expected to become a major bottleneck for the upcoming high-luminosity phase of the LHC and for future Higgs factories, motivating a concerted effort to develop computationally efficient solutions. Methods based on generative machine learning hold promise to alleviate the...
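The abstract names no particular architecture; as a minimal, hypothetical sketch of the surrogate idea only (a trivial parametric "generative model" standing in for a trained network, with illustrative numbers), the pattern is: fit a fast model to a limited set of expensive simulated events, then sample from the fit instead of rerunning the full simulation:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Stand-in for an expensive detector simulation: energy deposits
# drawn from some underlying distribution (hypothetical values).
def expensive_simulation(n):
    return rng.normal(loc=50.0, scale=5.0, size=n)

# "Train" a trivial surrogate by estimating the distribution's
# parameters from a limited simulated sample.
train = expensive_simulation(10_000)
mu, sigma = train.mean(), train.std()

# Fast surrogate: draw new events directly from the fitted model.
def surrogate(n):
    return rng.normal(loc=mu, scale=sigma, size=n)

fast_events = surrogate(100_000)
print(abs(fast_events.mean() - 50.0) < 1.0)
```

In a realistic setting the parametric fit is replaced by a deep generative model (GAN, VAE, normalizing flow, diffusion), but the division of labour is the same: pay the full simulation cost once for training, then generate cheaply.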
A bright future awaits particle physics. LHC Run 3 has just started, characterised by the most energetic beams ever created by humankind and the most sophisticated detectors. In the next few years we will carry out the most precise measurements yet to challenge our present understanding of nature, potentially leading us to prestigious discoveries. However, Run 3 is just the beginning. A...
Agent-based modeling is a versatile methodology to model complex systems and gain insights into fields as diverse as biology, sociology, economics, finance, and more. However, existing simulation platforms do not always take full advantage of modern hardware and therefore limit the size and complexity of the models that can be simulated.
This talk presents the BioDynaMo platform designed to...
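This is not BioDynaMo's actual API; as a toy Python sketch of the agent-based pattern the abstract describes (agents carrying local state and an update rule, iterated over discrete time steps), with all growth and division parameters invented for illustration:

```python
# Minimal agent-based model sketch: a population of "cells" that
# grow each time step and divide once large enough.
class Cell:
    def __init__(self, volume):
        self.volume = volume

    def step(self):
        self.volume *= 1.1            # simple growth rule (illustrative)
        if self.volume > 2.0:         # divide when volume exceeds threshold
            self.volume /= 2.0
            return Cell(self.volume)  # return the daughter cell
        return None

cells = [Cell(1.0)]
for _ in range(20):                   # simulation loop: 20 time steps
    daughters = [c.step() for c in cells]
    cells += [d for d in daughters if d is not None]

print(len(cells) > 1)                 # population has grown by division
```

Platforms like the one presented matter precisely because loops of this shape, scaled to billions of agents, must exploit vectorisation, parallelism and accelerators rather than naive per-agent iteration.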
As the search for new fundamental phenomena at modern particle colliders is a complex and multifaceted task dealing with high-dimensional data, it is not surprising that machine learning based techniques are quickly becoming a widely used tool for many aspects of searches. On the one hand, classical strategies are being supercharged by ever more sophisticated tagging algorithms; on the other...
Today, we live in a data-driven society. For decades, we wanted fast storage devices that can quickly deliver data, and storage technologies evolved to meet this requirement. As data-driven decision making becomes an integral part of enterprises, we are increasingly faced with a new need: one for cheap, long-term storage devices that can safely store the data we generate for tens or hundreds...
Simulated event samples from Monte Carlo event generators (MCEGs) are a backbone of the LHC physics programme.
However, for Run III, and in particular for the HL-LHC era, computing budgets are becoming increasingly constrained, while at the same time the push to higher accuracies
is making event generation significantly more expensive.
Modern ML techniques can help with the effort of...
“Computation” has become a massive part of our daily lives; even more so in science, where many experiments and analyses rely on massive computation. Under the assumption that computation is cheap and that time-to-result is the only relevant metric, we currently use computational resources at record-low efficiency.
In this talk, I argue this approach is an unacceptable waste of...
The expected volume of data from the new generation of scientific facilities such as the Square Kilometre Array (SKA) radio telescope has motivated the expanded use of semi-automatic and automatic machine learning algorithms for scientific discovery in astronomy. In this field, the robust and systematic use of machine learning faces a number of specific challenges, including both a lack of...
Strategies to detect data departures from a given reference model, with no prior bias on the nature of the new physics responsible for the discrepancy, might play a vital role in experimental programmes where, as at the LHC, increasingly rich experimental data are accompanied by increasingly blurred theoretical guidance for their interpretation. I will describe one such strategy that...
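The abstract does not spell out the strategy; one common model-independent approach (a hedged illustration, not necessarily the speaker's method, with all sample sizes and the injected distortion chosen for the example) trains a classifier to distinguish observed data from a reference sample and uses its learned log-odds, which approximate the log density ratio, as a departure statistic:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Reference-model sample and "observed" data with a small injected
# distortion (a mean shift) standing in for unknown new physics.
reference = rng.normal(0.0, 1.0, size=(4000, 1))
data = rng.normal(0.3, 1.0, size=(4000, 1))

# Logistic regression separating data (label 1) from reference
# (label 0), trained by plain gradient descent on cross-entropy.
X = np.vstack([reference, data])
y = np.concatenate([np.zeros(len(reference)), np.ones(len(data))])
w, b = np.zeros(1), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

# Departure statistic: summed log-odds over the data sample.
# Values far above zero flag a discrepancy with the reference.
t = float(np.sum(data @ w + b))
print(t > 0)
```

The appeal of this family of methods is that the classifier, not the analyst, decides where in phase space the data and the reference disagree; the statistic's distribution under the reference hypothesis is then calibrated with toy samples.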
AI is making an enormous impact on scientific discovery. Growing volumes of data across scientific domains are enabling the use of machine learning at ever increasing scale to accelerate discovery. Examples include using knowledge extraction and reasoning over large repositories of scientific publications to quickly study scientific questions or even come up with new questions, applying AI...
The production, validation and revision of data analysis applications is an iterative process that occupies a large fraction of a researcher's time-to-publication.
Providing interfaces that are simpler to use correctly and more performant out-of-the-box not only reduces the community's average time-to-insight but it also unlocks completely novel approaches that were previously impractically...
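As a hedged illustration of the kind of interface meant (a hypothetical toy, not any real library's API): declaring operations lazily lets a framework check and optimise the whole analysis before running it once, in a single pass over the data:

```python
import numpy as np

# Toy declarative analysis interface: operations are recorded, not
# executed, until a result is requested.
class ToyFrame:
    def __init__(self, columns):
        self.columns = columns   # dict of column name -> numpy array
        self.ops = []            # recorded operations, not yet run

    def define(self, name, fn):
        self.ops.append(("define", name, fn))
        return self

    def filter(self, fn):
        self.ops.append(("filter", None, fn))
        return self

    def count(self):
        # Single pass: replay the recorded operations in order.
        cols = dict(self.columns)
        mask = np.ones(len(next(iter(cols.values()))), dtype=bool)
        for kind, name, fn in self.ops:
            if kind == "define":
                cols[name] = fn(cols)
            else:
                mask &= fn(cols)
        return int(mask.sum())

pt = np.array([10.0, 25.0, 40.0, 55.0])
n = (ToyFrame({"pt": pt})
     .define("pt2", lambda c: c["pt"] ** 2)
     .filter(lambda c: c["pt2"] > 900.0)
     .count())
print(n)  # 2 entries survive the cut
```

Because the user states *what* to compute rather than *how*, the same declaration can transparently be run multithreaded or distributed, which is the "performant out-of-the-box" property the abstract alludes to.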
Precision simulations for collider phenomenology require intensive evaluations of complicated scattering amplitudes. Uncovering hidden simplicity in these basic building blocks of quantum field theory can lead us to new, efficient methods to obtain the necessary theoretical predictions. In this talk I will explore some new approaches to multi-scale loop amplitudes that can overcome...
I will discuss intersections of fundamental particle physics with quantum science and technology, including embedding challenging problems on quantum computation architectures.
See https://indico.cern.ch/event/1106990/contributions/4998162/
See https://indico.cern.ch/event/1106990/contributions/5097014/
See https://indico.cern.ch/event/1106990/contributions/4991353/
The Japanese flagship supercomputer Fugaku started its operation in early 2021.
After one and a half years of production runs, it is producing initial results in Lattice QCD applications such as thermodynamics, heavy- and light-quark flavour physics, and hadron structure and interactions.
In this talk, we first touch on the basics of Fugaku and its software status.
Discussion is given on...
Over the last decade, the C++ programming language has evolved significantly into a safer, easier-to-learn, better tool-supported general-purpose programming language capable of extracting the last bit of performance from bare metal. The emergence of technologies such as LLVM and Clang has advanced tooling support for C++, and its ecosystem has grown qualitatively. C++ has an important role in...