Description
One of the objectives of the EOSC (European Open Science Cloud) Future project was to integrate diverse analysis workflows from cosmology, astrophysics, and high-energy physics in a common framework. This led to the inception of the Virtual Research Environment (VRE) at CERN, a prototype platform supporting the goals of the Dark Matter and Extreme Universe Science Projects in compliance with FAIR (Findable, Accessible, Interoperable, Reusable) data policies. The project aimed to highlight the synergies between different dark matter communities and experiments, both by producing new scientific results and by making the necessary data and software tools fully available.

The VRE builds on a common authentication and authorisation infrastructure (AAI) and shares data from the different experiments (ATLAS, Fermi-LAT, CTA, DarkSide, KM3NeT, Virgo, LOFAR) in a reliable distributed storage infrastructure, the ESCAPE Data Lake. The entry point to the platform for an end user is a JupyterHub instance deployed on top of a scalable Kubernetes infrastructure, providing an interactive graphical interface for researchers to access, analyse, and share data. Data access and browsability are enabled through API calls to the high-level data management and storage orchestration software, Rucio. The VRE aims to streamline the development of end-to-end physics workflows, granting researchers access to an infrastructure that hosts easy-to-use physics analysis workflows from different experiments.

In this contribution, I will provide an overview of the VRE, highlight its use cases for analysers implementing and reproducing experimental analyses on a REANA cluster, and showcase the successful integration of an ATLAS experimental analysis workflow into the VRE platform.
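To give a flavour of how an analysis is packaged for reproducible execution on a REANA cluster, a minimal serial workflow specification (the `reana.yaml` file REANA reads) might look like the sketch below. The script name, input file, container image, and commands are illustrative assumptions, not the actual ATLAS workflow discussed in the talk:

```yaml
# reana.yaml -- minimal serial workflow sketch (all file names are hypothetical)
inputs:
  files:
    - analysis.py          # hypothetical analysis script
    - data/sample.root     # hypothetical input dataset
workflow:
  type: serial
  specification:
    steps:
      - name: run-analysis
        environment: 'python:3.10'   # container image the step runs in
        commands:
          - python analysis.py data/sample.root --out results/plot.png
outputs:
  files:
    - results/plot.png     # artefact REANA preserves after the run
```

A specification like this is typically submitted with the `reana-client` command-line tool, which uploads the inputs, runs each step in its declared container, and retains the declared outputs for later retrieval.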