    ▾    Graeme Stewart - HSF
        ▾    Points from the talk
            •    HSF created to face the computing & software challenges for HL-LHC in light of needs ramping up faster than technology
            •    Idea: bottom up effort, do-ocracy
            ▾    Foundation (non-profit; funded only through members' time contributed by their host universities)
                •    Does not have funding / commercialize software, but helps members apply for funding (e.g. IRIS-HEP)
            •    Has grown from roadmap whitepaper to a more organized working group structure
        ▾    Points from Q&A
            •    Computing is also an interest of the HSF; we have joint yearly meetings with the WLCG and are in touch regularly
            •    Good synergies with nuclear physics (ALICE, FAIR), discussions on data management (DOMA) in WLCG
            •    All software from collaborations is open source, licensing is also important
            ▾    Encouraging a modular approach to software, so components can be adopted as they mature rather than waiting for everything to be ready at once
                •    Projects that follow that idea and are rather mature: Rucio, DD4hep, ACTS [add links]
                •    Projects that could be useful: EM shower generation
            ▾    The HSF does not aim to write code for the experiments, but to be of great help to them
                •    e.g. by making sure experiments can find and take off-the-shelf pieces that are useful for their software
            •    Work on approximate statistical methods in event generation may be happening in some groups, not aware of official efforts yet though
            •    Role of ML: significant contributions in analysis (including object identification) - simulation and reconstruction are harder
    ▾    Carlos Munoz Camacho - AGATA and EIC
        ▾    Points from the talk
            •    Experiments covered: AGATA, EIC
            •    Some unique challenges, in terms of reconstruction and physics space
            ▾    (Actual) real-time analysis, using streaming readout
                •    Difference wrt LHC real-time analysis is the timescale: at the LHC, raw data is buffered for a few hours, as long as calibration requires, whereas here the goal is to avoid writing raw data altogether
                •    A key challenge of real time analysis is fast calibration / self-calibration
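The fast self-calibration point above can be illustrated with a minimal sketch: a per-channel pedestal tracked online with an exponential moving average and subtracted on the fly, so calibrated hits can be kept without ever storing the raw stream. The class name, channel scheme, and smoothing weight are illustrative assumptions, not code from any experiment.

```python
# Illustrative sketch (not experiment code): online self-calibration for a
# streaming readout. A per-channel pedestal is tracked with an exponential
# moving average (EMA) and subtracted on the fly.

class StreamingCalibrator:
    def __init__(self, alpha=0.01):
        self.alpha = alpha      # EMA weight for the pedestal estimate
        self.pedestal = {}      # channel id -> current pedestal estimate

    def process(self, channel, adc):
        """Update this channel's pedestal and return the calibrated value."""
        ped = self.pedestal.get(channel, adc)   # first sample seeds the estimate
        ped += self.alpha * (adc - ped)         # exponential moving average
        self.pedestal[channel] = ped
        return adc - ped                        # pedestal-subtracted signal

cal = StreamingCalibrator(alpha=0.1)
# A channel with baseline ~100 plus a genuine hit of +50 at the end:
stream = [100, 101, 99, 100, 102, 98, 150]
calibrated = [cal.process(0, adc) for adc in stream]
# Baseline samples calibrate towards ~0; the final hit stands out near +50.
```

The EMA weight trades adaptation speed against noise; a real system would also guard against genuine signal pulling the pedestal estimate.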
            ▾    Rates: comparable to LHCb, so not immense, and there is room to think outside the box
                •    Exascale computing brings accelerators and that's a real challenge for software writers
            •    There is an EIC software group, HSF is in contact with them
        ▾    Points from Q&A
            •    Do you share code with lattice QCD calculations (the main consumers of HPC)?
    ▾    Giovanni Lamanna - ESCAPE
        ▾    Points from discussion
            •    Data lake being co-developed with HEP
            •    Idea of a virtual research environment, with connections to HSF
            •    Economy of scale: choose to use existing building blocks, rather than rederive everything
            ▾    Since there will be many more communities, propose a modular workflow (with containers) that different collaborations can adapt to
                •    Build those example workflows around science cases
                •    Researchers become software writers, not only data users
                •    Teams are more diverse, including computing scientists, physicists and data scientists (more professionals needed)
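The modular-workflow idea above can be sketched in miniature: a pipeline of independent, swappable stages, where a collaboration replaces one stage (e.g. its own calibration) without touching the rest. In ESCAPE the stages would be containers; the hypothetical plain functions below stand in for them.

```python
# Illustrative sketch only: a workflow expressed as a pipeline of
# independent, swappable steps. All stage names and numbers are placeholders.

def decode(raw):
    """Placeholder decoding stage: raw counts to physical units."""
    return [x / 10.0 for x in raw]

def calibrate(samples):
    """Placeholder calibration stage: subtract the mean baseline."""
    baseline = sum(samples) / len(samples)
    return [s - baseline for s in samples]

def run_pipeline(raw, steps):
    """Feed the data through each stage in order."""
    data = raw
    for step in steps:
        data = step(data)
    return data

result = run_pipeline([10, 20, 30], [decode, calibrate])
print(result)   # -> [-1.0, 0.0, 1.0]
```

Swapping a stage means swapping one entry in the `steps` list, which is the property the containerised workflows aim for.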
            •    European Open Science Cloud infrastructure can provide funding, including for people
            ▾    Investment in training is very important as students have diverse backgrounds (computer science, physics, data science)
                •    The HSF could contribute to the ASTERICS school at LAPP by extending it to HEP as well
                •    This would be a way to start collaborating straight away; the HSF will email the responsible contact from ESCAPE
    ▾    Chris Tunnell - Experience from Direct Detection community
        ▾    Points from discussion
            •    DD is a large but more heterogeneous community compared to HEP
            ▾    Nevertheless, bottom-up effort has started, community-building stage
                •    First identify the needs, then the solutions
            •    Idea of short-term, limited-scope inter-collaboration efforts - could extend beyond DD?
            •    DANCE workshop @ RICE: https://dance.rice.edu 
    ▾    Paschal Coyle - KM3NeT/ANTARES
        ▾    Points from the talk
            •    Data volume is quite manageable
            •    Simulation is very expensive, especially with large detectors (ML, GPU interesting?)
            •    Accelerated photon transport - JUNO experiment simulations
            ▾    Event-like data, like HEP
                •    Machine learning is having a big impact
            ▾    Data still manageable, total of 995 TB * 3 building blocks
                •    Possibly the time to start thinking about archiving and opening data
            •    Starting to try DIRAC for MC generation / data analysis; thinking about using the Grid
        ▾    Points from Q&A
            ▾    Challenge: large detector → big simulation overhead
                •    Same as for CTA, where simulation covers only a few days' worth of data taking
                •    This is an issue for machine learning, as training data is limited
                •    Could look into accelerated photon transport from JUNO (plenary at CHEP)
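A toy Monte Carlo, with entirely made-up parameters (not KM3NeT or JUNO values), hints at why photon transport dominates simulation cost in a large detector: each photon is stepped individually until absorption or escape, so the work per photon grows with detector size.

```python
# Illustrative 1D toy only: photons random-walk with exponential scattering
# steps until absorbed or out of the volume. Step count is a proxy for CPU cost.
import random

def propagate_photon(scatter_length, absorb_prob, detector_radius, rng):
    """Step one photon until absorption or escape; return steps simulated."""
    x = 0.0
    steps = 0
    while abs(x) < detector_radius:
        steps += 1
        # draw a scattering step in a random direction
        x += rng.choice([-1, 1]) * rng.expovariate(1.0 / scatter_length)
        if rng.random() < absorb_prob:
            break   # photon absorbed in the medium
    return steps

rng = random.Random(42)
small = sum(propagate_photon(10.0, 0.02, 50.0, rng) for _ in range(1000))
large = sum(propagate_photon(10.0, 0.02, 200.0, rng) for _ in range(1000))
# Larger detector -> more steps per photon -> more CPU per simulated event,
# which is what GPU-accelerated photon transport (as in JUNO) targets.
```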
            ▾    Plugged into a real-time alert system sensitive to supernovae
                •    Even in that case, event rate does not go up significantly
    ▾    Final discussion and outcomes
        ▾    Strengthen links between HSF and ESCAPE
            •    Opportunity for further funding through this cluster, also for recruitment
            •    Could include training (some urgency on this - the school is in June)
        ▾    Software catalogs
            •    Many communities would benefit from a classification of useful/supported/documented software e.g. on peak finders, filtering, compression
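As a sketch of the kind of small, reusable building block such a catalog could index, here is a generic local-maximum peak finder over a sampled waveform; the function name and threshold are illustrative only, not taken from any listed project.

```python
# Illustrative sketch: a generic peak finder a software catalog might point to,
# instead of each collaboration rewriting one.

def find_peaks(samples, threshold):
    """Return indices of strict local maxima above `threshold`."""
    peaks = []
    for i in range(1, len(samples) - 1):
        if samples[i] > threshold and samples[i - 1] < samples[i] > samples[i + 1]:
            peaks.append(i)
    return peaks

waveform = [0, 1, 5, 2, 0, 0, 3, 8, 3, 1, 0]
print(find_peaks(waveform, threshold=2))   # -> [2, 7]
```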
        ▾    Trying a physics case: dark matter? 
            •    The know-how exists (and it is actively building up in direct detection), we could use dark matter searches as a prototype 
        ▾    What next
            •    Advertise interesting workshops
            •    Possibly put in an Expression Of Interest to APPEC-NuPECC-ECFA for support towards continuing this discussion