An IRIS-HEP Blueprint Workshop

A Coordinated Ecosystem for HL-LHC Computing R&D

Catholic University of America

620 Michigan Ave NE, Washington, DC 20064
Brian Paul Bockelman (University of Nebraska-Lincoln (US)), Gordon Watts (University of Washington (US)), Mark Neubauer (Univ. Illinois at Urbana-Champaign (US)), Michael David Sokoloff (University of Cincinnati (US)), Paolo Calafiura (Lawrence Berkeley National Lab. (US)), Peter Elmer (Princeton University (US))
Description

The research and development efforts required to address the HEP challenges of the HL-LHC are daunting. The current LHC physics program is enabled by an elaborate software and computing ecosystem, and the major hardware upgrades planned for the HL-LHC, together with its planned physics program, will require significant evolution of that ecosystem. Success will require major advances in software performance, adaptability, sustainability, and workforce development and training, advances that take full advantage of future data & compute platforms and leverage developments from outside HEP. A coherent R&D effort in software and computing is required to achieve the physics goals of that era.

The NSF Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) was established in September 2018 to meet the software and computing challenges of the HL-LHC through R&D of software for acquiring, managing, processing and analyzing HL-LHC data. The Institute is now nearly fully staffed and approximately one year into both its R&D program and its role as an intellectual hub for HEP software and computing. The time is right to plan the broad program of software R&D for HEP and to define more clearly how IRIS-HEP fits into this program going forward into the HL-LHC era. The primary theme of this workshop is to explore, and to establish to the largest extent possible, coherence and alignment within this broad program. The workshop aims to bring together representatives from the IRIS-HEP team, US funding agencies, software and computing management in the stakeholder experiments, national & international laboratories, leadership-class facilities (LCFs) & Centers, and partner projects to make progress toward this goal. Specific questions the workshop will address include:

  1. How does the ensemble of US software R&D efforts fit together to implement the HL-LHC software/computing roadmap described in the Community White Paper and to meet the challenges of the HL-LHC? Which areas are not covered by US R&D efforts?
  2. How do the US software R&D efforts collaborate with each other and with international efforts? How do these efforts align with and leverage national exascale initiatives, NSF OAC priorities, and trends in the broader community?
  3. How should the US R&D efforts be structured and organized in order to shape the planned updates (all in ~2021/2022) to the HSF Community White Paper, the software/computing part of the US Snowmass process, and the HL-LHC experiment-specific software/computing TDRs?

This workshop builds on the Mini-workshop on HL-LHC Software and Computing R&D held at the Catholic University of America in November 2017, and brings together the US stakeholders and projects that will contribute to the R&D program to prepare the computing models and systems for the HL-LHC.

This event will take place at the Catholic University of America. See Venue for location details. 

This event is being organized in part by the S2I2-HEP Conceptualization project, including travel support for some participants. The S2I2-HEP project is supported by National Science Foundation grants OAC-1558216, OAC-1558219, and OAC-1558233. The Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) is also providing support for this workshop via National Science Foundation Cooperative Agreement OAC-1836650.

Contact: Mark Neubauer