November 29, 2021 to December 3, 2021
Virtual and IBS Science Culture Center, Daejeon, South Korea
Asia/Seoul timezone

neos: Physics analysis as a differentiable program

contribution ID 773
Not scheduled
20m
Windmill (Gather.Town)

Poster Track 2: Data Analysis - Algorithms and Tools Posters: Windmill

Speaker

Mr Nathan Daniel Simpson (Lund University (SE))

Description

The advent of deep learning has yielded powerful tools for automatically computing the gradients of complex computations. This is because “training a neural network” amounts to iteratively updating its parameters via gradient descent to find the minimum of a loss function. Deep learning is then a subset of a broader paradigm: a workflow with free parameters that is end-to-end optimisable, provided one can keep track of the gradients all the way through. This paradigm is known as differentiable programming.
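The idea can be illustrated with a minimal JAX sketch (JAX being the framework neos builds on): compose ordinary functions into a workflow, then differentiate the whole composition with respect to a free parameter. The function names here are purely illustrative, not part of any library.

```python
import jax
import jax.numpy as jnp

# Two illustrative stages composed into one "workflow".
def calibrate(data, scale):
    # A calibration step with a free parameter.
    return scale * data

def summarise(data):
    # A downstream summary of the calibrated data.
    return jnp.sum(data ** 2)

def workflow(scale, data):
    return summarise(calibrate(data, scale))

# jax.grad differentiates through every stage of the composition.
d_workflow = jax.grad(workflow)

data = jnp.array([1.0, 2.0, 3.0])
# d/dscale of sum((scale * x)^2) = 2 * scale * sum(x^2) = 2 * 2 * 14 = 56
g = d_workflow(2.0, data)
```

Because the gradient flows through the entire composition, any parameter at any stage can be tuned by gradient descent against a downstream objective.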

This work introduces neos: an example implementation of a fully differentiable workflow that reframes physics analysis as a differentiable program. It is capable of optimising a learnable summary statistic with respect to a variety of loss functions, including expected discovery significance, CLs, the uncertainty on parameters of interest, and pull widths. The result is an optimisation process that is aware of how every step in the workflow changes the metric of choice, including the modelling and treatment of nuisance parameters.
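As a toy illustration of this idea (not neos's actual API or statistical model), a cut-based selection can be made differentiable by replacing the hard cut with a sigmoid, after which the cut position itself can be tuned by gradient descent on an approximate significance:

```python
import jax
import jax.numpy as jnp

# Toy events: signal peaks at 1.0, background at 0.0, in one observable.
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
sig = 1.0 + 0.5 * jax.random.normal(k1, (1000,))
bkg = 0.5 * jax.random.normal(k2, (5000,))

def loss(cut):
    # A soft (sigmoid) selection keeps the event yields differentiable in the cut.
    s = 0.01 * jnp.sum(jax.nn.sigmoid((sig - cut) / 0.1))  # scaled signal yield
    b = 0.01 * jnp.sum(jax.nn.sigmoid((bkg - cut) / 0.1))  # scaled background yield
    return -s / jnp.sqrt(s + b)  # maximise an approximate significance

grad_fn = jax.grad(loss)
cut = 0.0
for _ in range(100):
    cut = cut - 0.1 * grad_fn(cut)  # gradient descent on the cut position
```

neos generalises this picture: rather than a single cut, a neural-network summary statistic and the full downstream statistical model (including nuisance parameters) sit inside the differentiable pipeline.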

Significance

Systematic-aware approaches to learning in HEP have been sought after for some time (e.g. https://arxiv.org/abs/1806.04743, https://arxiv.org/abs/2105.08742), but this is the first time a pipeline has been made fully end-to-end optimisable. Moreover, this work reframes the problem in a more general way, and discusses the interchangeable building blocks that encompass both this and some past approaches, prompting a shift to a more flexible paradigm.

References

PyHEP 2020: https://www.youtube.com/watch?v=3P4ZDkbleKs

MODE workshop on differentiable programming: https://indico.cern.ch/event/1022938/contributions/4487419/

ATLAS-internal ML workshop: https://indico.cern.ch/event/1057472/contributions/4470766/attachments/2289731/3892673/AML_forum_July_2021_uncertanties_Nathan_Simpson_differentiable_analysis.pdf

Citeable code: https://github.com/gradhep/neos

Speaker time zone: Compatible with Europe

Primary author

Mr Nathan Daniel Simpson (Lund University (SE))

Co-author

Lukas Alexander Heinrich (New York University (US))
