Large Scale Monte Carlo Simulation of neutrino interactions using the Open Science Grid and Commercial Clouds

1919-1 Tancha, Onna-son, Kunigami-gun Okinawa, Japan 904-0495
Poster presentation
Track 5: Computing activities and Computing models



Modern long baseline neutrino experiments, like the NOvA experiment at Fermilab, require large scale, compute intensive simulations of their neutrino beam fluxes and of backgrounds induced by cosmic rays. The amount of simulation required to keep systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 300 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large scale runs on commercial clouds like Amazon EC2. This paper details the challenges of performing large scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly route different simulation and data processing tasks to these resources. We discuss the optimization of the simulation computing model, data movement, and computation to match the commercial computing environment.

Primary author

Andrew Norman (Fermilab)

Co-authors

Eric Flumerfelt (Fermilab), Gavin Davies (Iowa State University), Matthew Tamsett (University of Sussex), Nathan Mayer (Tufts University), Steven Timm (Fermilab)
