In the study of complex systems, evaluating physical observables often requires sampling representative configurations via Monte Carlo techniques. These methods rely on repeated evaluations of the system's energy and force fields, which can become computationally expensive. To accelerate these simulations, deep learning models are increasingly employed as surrogate functions to approximate the...
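As a concrete illustration of the surrogate idea sketched above, the following minimal example (not taken from the contribution) runs a Metropolis sampler whose energy evaluations go through a surrogate callable; the surrogate here is a cheap analytic stand-in for a trained network, and all names are illustrative.

```python
import numpy as np

# Minimal sketch: Metropolis Monte Carlo in which the expensive physical
# energy is replaced by a surrogate callable. The "surrogate" below is a
# stand-in analytic function; in practice it would be a trained neural network.
def surrogate_energy(x):
    # hypothetical surrogate: harmonic well with a quartic correction
    return 0.5 * np.sum(x**2) + 0.1 * np.sum(x**4)

def metropolis(energy_fn, x0, n_steps=10_000, step_size=0.5, beta=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x, e = x0.copy(), energy_fn(x0)
    samples = []
    for _ in range(n_steps):
        proposal = x + step_size * rng.normal(size=x.shape)
        e_new = energy_fn(proposal)
        # Metropolis acceptance test using only surrogate energies
        if e_new <= e or rng.random() < np.exp(-beta * (e_new - e)):
            x, e = proposal, e_new
        samples.append(x.copy())
    return np.array(samples)

samples = metropolis(surrogate_energy, x0=np.zeros(3))
print("mean configuration:", samples.mean(axis=0))
```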
Neural Simulation-Based Inference (NSBI) is a powerful class of machine learning (ML)-based methods for statistical inference that naturally handle high-dimensional parameter estimation without the need to bin data into low-dimensional summary histograms. Such methods are promising for a range of measurements at the Large Hadron Collider, where no single observable may be optimal to scan over...
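For orientation, the sketch below illustrates the classifier-based likelihood-ratio trick that underlies many NSBI methods on a two-Gaussian toy problem; it is a generic illustration under assumed toy distributions, not the LHC analysis described in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy sketch of the likelihood-ratio trick: a classifier trained to separate
# samples drawn under two hypotheses approximates p1(x)/p0(x) via s/(1-s).
rng = np.random.default_rng(0)
x0 = rng.normal(loc=0.0, scale=1.0, size=(10_000, 2))   # "reference" hypothesis
x1 = rng.normal(loc=0.3, scale=1.0, size=(10_000, 2))   # "alternative" hypothesis
X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(len(x0), dtype=int), np.ones(len(x1), dtype=int)])

clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=100).fit(X, y)

def log_likelihood_ratio(x):
    s = clf.predict_proba(x)[:, 1]
    return np.log(s / (1.0 - s))   # estimates log p1(x)/p0(x), up to calibration

print(log_likelihood_ratio(np.array([[0.0, 0.0], [1.0, 1.0]])))
```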
I will review parametrized classifiers for optimizing the sensitivity to EFT operators and some of the machine-learning approaches for general anomaly detection. Particular attention will be devoted to validation procedures and to ways of treating uncertainties.
I discuss how uncertainties related to the machine-learning modeling of a regression problem, as well as those related to missing theoretical information, can be estimated and subsequently validated. Even though these uncertainties are intrinsically Bayesian, given that there is only one underlying true theory and one true model, they can be determined in both a Bayesian and a frequentist framework. I...
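As one common way to obtain such an ML-modeling uncertainty for a regression, the sketch below uses a small ensemble with different initialisations on synthetic data; the spread across members gives a per-point uncertainty that could then be validated, e.g. with pull distributions or coverage tests. This is an illustrative estimator, not necessarily the one discussed in the talk.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Ensemble-based regression uncertainty on a toy 1D problem.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(x[:, 0]) + 0.1 * rng.normal(size=500)

ensemble = [MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000,
                         random_state=seed).fit(x, y) for seed in range(5)]

x_test = np.linspace(-3, 3, 7).reshape(-1, 1)
preds = np.stack([m.predict(x_test) for m in ensemble])
print("prediction:        ", preds.mean(axis=0).round(3))
print("model uncertainty: ", preds.std(axis=0).round(3))
```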
The phenomenon of Jet Quenching, a key signature of the Quark-Gluon Plasma (QGP) formed in Heavy-Ion (HI) collisions, provides a window into the properties of the primordial liquid. In this study, we evaluate the discriminating power of Energy Flow Networks (EFNs), enhanced with substructure observables, in distinguishing between jets stemming from proton-proton (pp) and jets...
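The sketch below shows the basic EFN structure (per-particle network applied to angular features, energy-weighted sum, classifier head) with additional substructure observables concatenated to the latent vector; the exact way the substructure observables enter the models of this study may differ, and all dimensions and names here are illustrative.

```python
import torch
import torch.nn as nn

# Minimal Energy Flow Network (deep-sets style) with substructure inputs.
class EFN(nn.Module):
    def __init__(self, latent_dim=64, n_substructure=2):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                                 nn.Linear(64, latent_dim))
        self.f = nn.Sequential(nn.Linear(latent_dim + n_substructure, 64),
                               nn.ReLU(), nn.Linear(64, 1))

    def forward(self, z, angles, substructure):
        # z: (batch, n_particles) energy fractions, angles: (batch, n_particles, 2)
        per_particle = self.phi(angles)                    # (B, N, latent)
        latent = (z.unsqueeze(-1) * per_particle).sum(1)   # energy-weighted sum
        return self.f(torch.cat([latent, substructure], dim=-1))

model = EFN()
z = torch.rand(8, 30)
angles = torch.randn(8, 30, 2)
subs = torch.randn(8, 2)                # placeholder substructure observables
logits = model(z, angles, subs)         # pp vs. heavy-ion jet score (pre-sigmoid)
print(logits.shape)
```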
In this contribution I will review the use cases of uncertainty quantification with deep learning in high-energy astroparticle physics. Among other things, I will present the combination of neural networks with conditional normalizing flows to predict the posterior for all quantities of interest. This ansatz can be further expanded with the SnowStorm method developed by the IceCube...
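To make the flow-based posterior idea concrete, here is a minimal conditional normalizing flow (RealNVP-style affine coupling) whose density over the parameters depends on the observation; it is an illustrative stand-in, not the speaker's implementation, and all dimensions are assumptions.

```python
import math
import torch
import torch.nn as nn

# Conditional affine-coupling flow approximating p(theta | x).
class ConditionalAffineCoupling(nn.Module):
    def __init__(self, dim=2, context_dim=4, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(nn.Linear(half + context_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - half)))

    def forward(self, theta, x):
        # Maps theta -> z; returns z and log|det Jacobian|.
        a, b = theta.chunk(2, dim=-1)
        if self.flip:
            a, b = b, a
        s, t = self.net(torch.cat([a, x], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)                       # keep scales well behaved
        zb = (b - t) * torch.exp(-s)
        z = torch.cat([zb, a], -1) if self.flip else torch.cat([a, zb], -1)
        return z, -s.sum(-1)

class ConditionalFlow(nn.Module):
    def __init__(self, dim=2, context_dim=4, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [ConditionalAffineCoupling(dim, context_dim, flip=bool(i % 2))
             for i in range(n_layers)])

    def log_prob(self, theta, x):
        z, log_det = theta, 0.0
        for layer in self.layers:
            z, ld = layer(z, x)
            log_det = log_det + ld
        log_base = -0.5 * (z**2).sum(-1) - 0.5 * z.shape[-1] * math.log(2 * math.pi)
        return log_base + log_det

# Training would minimise -flow.log_prob(theta, x).mean() over simulated pairs.
flow = ConditionalFlow()
theta = torch.randn(16, 2)
x = torch.randn(16, 4)
print(flow.log_prob(theta, x).shape)
```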
The interTwin project develops an open-source Digital Twin Engine to integrate application-specific Digital Twins (DTs) across scientific domains. Its framework for the development of DTs supports interoperability, performance, portability and accuracy. As part of this initiative, we implemented the CaloINN normalizing-flow model for calorimeter simulations within the interTwin framework....
Fast and precise evaluation of scattering amplitudes, even in the case of precision calculations, is essential for event generation tools at the HL-LHC. We explore the scaling behavior of the achievable precision of neural networks in this regression problem for multiple architectures, including a Lorentz-symmetry-aware multilayer perceptron and the L-GATr architecture. L-GATr is equivariant...
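For context, the sketch below shows a plain MLP baseline for amplitude regression on Lorentz-invariant inputs (pairwise Minkowski dot products of the external momenta); the Lorentz-aware and L-GATr architectures from the abstract are not reproduced here, and the data are placeholders.

```python
import torch
import torch.nn as nn

# Baseline amplitude regression on Lorentz invariants s_ij = p_i . p_j.
def minkowski_dots(p):
    # p: (batch, n_particles, 4) with components (E, px, py, pz)
    metric = torch.tensor([1.0, -1.0, -1.0, -1.0])
    dots = torch.einsum('bia,a,bja->bij', p, metric, p)
    i, j = torch.triu_indices(p.shape[1], p.shape[1], offset=1)
    return dots[:, i, j]                        # unique pairwise invariants

n_particles = 4
n_inv = n_particles * (n_particles - 1) // 2
model = nn.Sequential(nn.Linear(n_inv, 128), nn.GELU(),
                      nn.Linear(128, 128), nn.GELU(),
                      nn.Linear(128, 1))

momenta = torch.randn(32, n_particles, 4)       # placeholder phase-space points
target_log_amp = torch.randn(32, 1)             # placeholder targets
pred = model(minkowski_dots(momenta))
loss = nn.functional.mse_loss(pred, target_log_amp)
print(float(loss))
```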
The Fair Universe project organised the HiggsML Uncertainty Challenge, which took place from 12 September 2024 to 14 March 2025. This groundbreaking competition in high-energy physics (HEP) and machine learning was the first to place a strong emphasis on uncertainties, focusing both on mastering the uncertainties in the input training data and on providing credible confidence intervals in the...
Causality, in Pearl’s framework, is defined through structural causal models: systems of structural equations with exogenous variables and a directed acyclic graph that encodes cause–effect relations. In contrast, correlation, which often forms the basis of artificial intelligence models, quantifies statistical association and may arise from confounding or indirect paths without implying a...
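A minimal numerical sketch of this distinction, with assumed structural equations: a confounder Z drives both X and Y (DAG Z -> X, Z -> Y, no X -> Y edge), so X and Y are strongly correlated even though intervening on X has no effect on Y.

```python
import numpy as np

# Structural causal model: exogenous noise, structural equations, DAG Z->X, Z->Y.
rng = np.random.default_rng(0)
n = 100_000

def sample(do_x=None):
    z = rng.normal(size=n)                     # exogenous confounder
    x = 0.8 * z + 0.2 * rng.normal(size=n)     # structural equation for X
    if do_x is not None:
        x = np.full(n, do_x)                   # intervention do(X=x): cut Z -> X
    y = 0.8 * z + 0.2 * rng.normal(size=n)     # structural equation for Y
    return x, y

x, y = sample()
print("observational corr(X, Y):", np.corrcoef(x, y)[0, 1])        # ~0.9
_, y0 = sample(do_x=0.0)
_, y1 = sample(do_x=2.0)
print("E[Y | do(X=2)] - E[Y | do(X=0)]:", y1.mean() - y0.mean())    # ~0
```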
Rigorous statistical methods, including the estimation of parameter values and their uncertainties, underpin the validity of scientific discovery and have been especially important in the natural sciences. In the age of data-driven modeling, where the complexity of data and statistical models grows exponentially as computing power increases, uncertainty quantification has become exceedingly...
Anomaly detection in multivariate time series is an important problem across various fields such as healthcare, financial services, manufacturing or physics detector monitoring. Accurately identifying the instances when defects occur is essential but challenging, as the types of anomalies are unknown beforehand and reliably labelled data are scarce.
We evaluate unsupervised transformer-based...
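As one example of this family of approaches, the sketch below trains nothing but shows the scoring pattern: a transformer encoder reconstructs sliding windows of a multivariate series, and the per-window reconstruction error serves as the anomaly score. The architecture and dimensions are illustrative, not the evaluated models themselves.

```python
import torch
import torch.nn as nn

# Unsupervised anomaly scoring via transformer-based reconstruction error.
class TransformerReconstructor(nn.Module):
    def __init__(self, n_channels=8, d_model=64, nhead=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_channels)

    def forward(self, x):                       # x: (batch, window, channels)
        return self.head(self.encoder(self.embed(x)))

model = TransformerReconstructor()
windows = torch.randn(16, 50, 8)                # placeholder sensor windows
recon = model(windows)
score = (recon - windows).pow(2).mean(dim=(1, 2))   # anomaly score per window
print(score.shape)                              # flag windows with high score
```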
Biological synapses effortlessly balance memory retention and flexibility, yet artificial neural networks still struggle with the extremes of catastrophic forgetting and catastrophic remembering. Here, we introduce Metaplasticity from Synaptic Uncertainty (MESU), a Bayesian framework that updates network parameters according to their uncertainty. This approach allows a principled combination...
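The following is a heavily simplified caricature of uncertainty-scaled ("metaplastic") updates, not the actual MESU rule: each weight carries a mean and a variance, steps are scaled by the variance (uncertain weights move more, consolidated ones barely move), and the variance shrinks as evidence accumulates.

```python
import numpy as np

# Caricature of uncertainty-scaled parameter updates.
rng = np.random.default_rng(0)
mu = rng.normal(size=4)            # weight means
sigma2 = np.ones(4)                # weight variances (epistemic uncertainty)

def update(grad, mu, sigma2, obs_noise=0.1):
    mu_new = mu - sigma2 * grad                            # variance-scaled step
    sigma2_new = 1.0 / (1.0 / sigma2 + 1.0 / obs_noise)    # precision accumulates
    return mu_new, sigma2_new

for step in range(3):
    grad = rng.normal(size=4)      # stand-in for a task gradient
    mu, sigma2 = update(grad, mu, sigma2)
    print(f"step {step}: variances {sigma2.round(3)}")
```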
Correctly calibrated uncertainties have always been a fundamental pillar of particle physics. As machine learning becomes increasingly integrated into both experimental and theoretical workflows, it is essential that neural network predictions include robust and reliable uncertainty estimates.
This talk will review current approaches to uncertainty estimation in neural networks, focusing on...
Geometric learning pipelines have achieved state-of-the-art performance in High-Energy and Nuclear Physics reconstruction tasks like flavor tagging and particle tracking [1]. Starting from a point cloud of detector or particle-level measurements, a graph can be built where the measurements are nodes, and where the edges represent all possible physics relationships between the nodes. Depending...
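The graph-building step described above can be sketched as follows: detector hits are nodes with spatial coordinates, and edges connect each hit to its k nearest neighbours as candidate physics relationships (PyG-style edge_index layout; the features and the choice of k are assumptions for illustration).

```python
import torch

# Build a k-nearest-neighbour graph from a point cloud of measurements.
def knn_edge_index(points, k=4):
    # points: (n_nodes, 3) spatial coordinates
    dists = torch.cdist(points, points)                 # pairwise distances
    dists.fill_diagonal_(float('inf'))                  # exclude self-loops
    neighbours = dists.topk(k, largest=False).indices   # (n_nodes, k)
    src = torch.arange(points.shape[0]).repeat_interleave(k)
    dst = neighbours.reshape(-1)
    return torch.stack([src, dst])                      # shape (2, n_nodes * k)

hits = torch.randn(100, 3)           # placeholder point cloud of detector hits
edge_index = knn_edge_index(hits)
print(edge_index.shape)              # (hits, edge_index) would feed a GNN
```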
Critical Heat Flux (CHF) represents a concern for nuclear safety, as it leads to a rapid drop in the heat transfer between a heated surface and the liquid coolant in the core of nuclear reactors. This can cause several problems for the system, including structural damage and the release of radioactive material.
The main challenge related to CHF prediction is the highly non-linear...
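A minimal data-driven baseline for this kind of non-linear regression is sketched below; the feature names and the synthetic target are illustrative assumptions, not the CHF datasets or models of the contribution.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Toy CHF regression: scaled MLP mapping thermal-hydraulic conditions to CHF.
rng = np.random.default_rng(0)
n = 2_000
pressure = rng.uniform(0.1, 20.0, n)       # MPa
mass_flux = rng.uniform(50, 8000, n)       # kg m^-2 s^-1
subcooling = rng.uniform(0, 400, n)        # kJ/kg inlet subcooling
X = np.column_stack([pressure, mass_flux, subcooling])
chf = 0.5 * mass_flux**0.4 * (1 + 0.01 * subcooling) / (1 + 0.05 * pressure)  # toy target

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500))
model.fit(X, chf)
print("R^2 on training data:", round(model.score(X, chf), 3))
```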
Over the last decade, machine learning has had a tremendous impact on biological sequence data analysis. In this talk, I will begin by introducing general issues related to biological sequence modeling. I will then review a selection of recent works on this topic, including: i) generative models for sequence design, ii) sampling of evolutionary paths between natural sequences of different...