The worldwide particle physics community is currently planning upgrades to the Large Hadron Collider (LHC) at CERN in Geneva. The LHC already relies on a worldwide distributed computing grid so that thousands of scientists can process and analyze some of the largest scientific datasets in the world. The planned upgrades will increase data volumes by more than two orders of magnitude and will require significantly more sophisticated data handling and analysis techniques.
This 2nd S2I2 HEP/CS workshop aims to bring together a diverse set of attendees from the high energy physics (HEP) and computer science (CS) communities to explore how the two communities could work together in the context of a future NSF Software Institute aimed at supporting particle physics research over the long term. We will build on the discussions from the first S2I2 HEP/CS workshop, take a fresh look at planned HEP and computer science research, and identify specific areas of effort, perspectives, synergies, and expertise of mutual benefit to the HEP and CS communities, especially as they relate to a future NSF Software Institute for HEP.
Discussions and sessions include:
- Science Practices & Policies
- Sociology and Community Issues
- Machine Learning
- Software Life Cycle / Software Engineering / Software/Data/Workflow Preservation & Reproducibility
- Scalable Platforms
- Data Management, Access, Distribution, Organization
- Data Intensive Analysis Tools and Techniques
- Visualization
- Data Streaming
- Training, Education, Professional Development, Advancement
Sponsors
This event is being organized in part by the S2I2-HEP Conceptualization project, which is providing travel support for some participants. The S2I2-HEP project is supported by National Science Foundation grants ACI-1558216, ACI-1558219, and ACI-1558233.