During 2016–2017, the worldwide HEP community met in a series of workshops to prepare a roadmap for the software R&D needed to address the data and computational challenges of the High Luminosity LHC (HL-LHC) and other HEP experiments in the 2020s. This process was organized by the HEP Software Foundation, and its outcome was a community white paper titled "A Roadmap for HEP Software and Computing R&D for the 2020s".
The community is now proceeding with elements of that plan. In particular, the NSF in the US has funded the Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP). IRIS-HEP is intended to serve as an active center for software R&D, to function as an intellectual hub for the larger community-wide software R&D effort, and to transform the operational services required to ensure the success of the HL-LHC scientific program. Three high-impact R&D areas will leverage the talents of the U.S. university community: (1) the development of innovative algorithms for data reconstruction and triggering, (2) the development of highly performant analysis systems that reduce "time-to-insight" and maximize the HL-LHC physics potential, and (3) the development of data organization, management, and access systems for the community's upcoming exabyte era. IRIS-HEP will also sustain investments in distributed high-throughput computing (DHTC) for the LHC through the Open Science Grid, and will build an integration path (the Scalable Systems Laboratory) to deliver the output of its R&D activities into the distributed and scientific production infrastructures.
In this talk we describe the current R&D activities of IRIS-HEP and the wider community to implement the roadmap. We also discuss related activities to implement a vision for training young researchers in the computational and data science tools and techniques required to be a successful researcher today.