We present recent work supporting deep learning for particle physics and cosmology at NERSC, the US Department of Energy's mission HPC center. We describe infrastructure and software that support both large-scale distributed training across CPU and GPU HPC resources and productive interfaces via Jupyter notebooks. We also detail plans for accelerated deep-learning hardware in future HPC machines at NERSC, 'Perlmutter' and beyond. We demonstrate these capabilities with a characterisation of the emerging deep learning workload running at NERSC. We also present the use of these resources for specific cutting-edge applications, including conditional Generative Adversarial Networks for particle physics and dark-matter cosmology simulations, and Bayesian inference via probabilistic programming for LHC analyses.