As part of the search for physics Beyond the Standard Model, an array of next-generation particle, nuclear, and astroparticle experiments is under construction by global collaborations. These include the High-Luminosity Large Hadron Collider (HL-LHC) at CERN, the Deep Underground Neutrino Experiment (DUNE) at Fermilab, the Electron-Ion Collider (EIC) at Brookhaven National Laboratory, the Facility for Antiproton and Ion Research (FAIR) at GSI, and many others.
These experiments are massive data generators, and the data science challenges they pose are significant. For example, the HL-LHC experiments are expected to produce exabytes of science data each year. Discoveries require analyzing these huge data volumes and understanding extremely complex instruments, with ever more sophisticated algorithms. The development of highly performant data analysis systems that reduce the "time to insight" and maximize physics potential is crucial. This calls for continued innovation in established community tools such as ROOT, the adoption of new cutting-edge data science tools, the development of dedicated analysis facilities, advanced machine learning, and entirely new avenues to explore, such as differentiable programming.
Five years have passed since the first Analysis Ecosystems Workshop, organised by the HSF in 2017. Much has changed since then, with the advent of new projects, tools, and data formats, and with intense activity and progress in established projects. Yet the challenge of efficient analysis for the HL-LHC era remains unsolved, so the HSF and IRIS-HEP, together with IJCLab, are organising the Second Analysis Ecosystems Workshop.
Topics for the workshop will include, amongst others:
True to the workshop format, there will be a limited number of presentations, ample time for discussion, and a written outcome that summarises the workshop's conclusions and points the way forward.