Description
Uncertainty estimation is a crucial issue when considering the application of deep neural networks to problems in high energy physics such as jet energy calibration.
We introduce and benchmark a novel algorithm that quantifies uncertainties by Monte Carlo sampling from the model's Gibbs posterior distribution. Unlike the established 'Bayes By Backpropagation' training regime, it does not rely on any approximations of the network weight posterior, is compatible with most training regimes, and can be applied after training to any network. For a one-dimensional regression task, we show that the algorithm describes epistemic uncertainties well, including large errors for extrapolation.
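To make the idea concrete, here is a minimal sketch (not the authors' implementation) of post-training Monte Carlo sampling from a Gibbs posterior p(w) ∝ exp(-β L(w)) over the weights of a small regression network, using a random-walk Metropolis sampler. The sampler choice, the tiny MLP, the toy 1D dataset, and the values of β and the proposal scale are all illustrative assumptions; the spread of predictions across samples serves as the epistemic uncertainty and grows outside the training range.

```python
# Sketch: Metropolis-Hastings sampling from a Gibbs posterior exp(-beta * loss)
# over the weights of a small regression network, applied after training.
# Network size, beta, proposal scale, and the toy dataset are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D regression data; |x| > 2 is left unobserved to probe extrapolation.
x_train = rng.uniform(-2.0, 2.0, size=(64, 1))
y_train = np.sin(2.0 * x_train) + 0.1 * rng.normal(size=x_train.shape)

def init_params(hidden=16):
    return {
        "W1": 0.5 * rng.normal(size=(1, hidden)),
        "b1": np.zeros(hidden),
        "W2": 0.5 * rng.normal(size=(hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(p, x):
    h = np.tanh(x @ p["W1"] + p["b1"])
    return h @ p["W2"] + p["b2"]

def loss(p):
    # Mean squared error; the Gibbs posterior is proportional to exp(-beta * loss).
    return np.mean((forward(p, x_train) - y_train) ** 2)

def metropolis_samples(p0, n_samples=200, n_steps=50, step=0.02, beta=200.0):
    """Random-walk Metropolis over the weights, starting from a trained point p0."""
    p = {k: v.copy() for k, v in p0.items()}
    current = loss(p)
    samples = []
    for _ in range(n_samples):
        for _ in range(n_steps):
            prop = {k: v + step * rng.normal(size=v.shape) for k, v in p.items()}
            new = loss(prop)
            # Accept with probability min(1, exp(-beta * (L_new - L_old))).
            if np.log(rng.uniform()) < -beta * (new - current):
                p, current = prop, new
        samples.append({k: v.copy() for k, v in p.items()})
    return samples

# p_trained would come from any standard training regime; here we just use the init.
p_trained = init_params()
samples = metropolis_samples(p_trained)

# Epistemic uncertainty: spread of predictions across posterior samples,
# which grows outside the training range (extrapolation).
x_test = np.linspace(-4, 4, 9).reshape(-1, 1)
preds = np.stack([forward(s, x_test) for s in samples])
mean, std = preds.mean(axis=0).ravel(), preds.std(axis=0).ravel()
for xi, m, s in zip(x_test.ravel(), mean, std):
    print(f"x={xi:+.1f}  mean={m:+.3f}  std={s:.3f}")
```

Because the sampler only needs the loss value, not gradients of an approximate posterior, it can be attached to an already trained network regardless of how that network was optimised.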