Abstract The fields of high-energy physics (HEP) and machine learning (ML) converge on the challenge of uncertainty-aware parameter estimation in the presence of data distribution distortions, known as systematic uncertainties in HEP and domain shifts in ML. We present a novel approach based on Contrastive Normalizing Flows (CNFs), which achieved top performance on the HiggsML Uncertainty Challenge. Building on the insight that a binary classifier can approximate the model parameter likelihood ratio $\frac{P(x_i \mid \theta_1)}{P(x_i \mid \theta_2)}$, we address two practical limitations of this approach, limited classifier expressivity and the high cost of simulating high-dimensional parameter grids, by embedding data and parameters in a learned CNF mapping. This mapping models a unique and tunable contrastive distribution that enables robust classification under shifted data distributions. Through a combination of theoretical analysis and empirical evaluation, we show that CNFs, when coupled with a classifier and an appropriate statistical treatment, provide principled parameter estimation and uncertainty quantification through robust classification.
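As background for the insight the abstract builds on, here is a minimal sketch of the standard likelihood-ratio trick; the notation $s^*(x)$ for the Bayes-optimal classifier output is assumed here and is not taken from the submission. A classifier trained with balanced classes to distinguish samples $x \sim P(x \mid \theta_1)$ (label 1) from $x \sim P(x \mid \theta_2)$ (label 0) has optimal output

$$
s^*(x) = \frac{P(x \mid \theta_1)}{P(x \mid \theta_1) + P(x \mid \theta_2)}
\quad\Longrightarrow\quad
\frac{P(x \mid \theta_1)}{P(x \mid \theta_2)} = \frac{s^*(x)}{1 - s^*(x)}.
$$

In practice $s$ is a trained network, so the recovered ratio is only as accurate as the classifier's expressivity and calibration allow, which is one of the limitations the CNF embedding is meant to address.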
Context This is the method paper for a top-performing solution to the HiggsML Uncertainty Challenge (https://arxiv.org/abs/2410.02867). It will also be presented at the Fair Universe HiggsML Uncertainty workshop at CERN.
Would you like to be considered for an oral presentation? Yes