The intelligent Data Delivery Service (iDDS) has been developed to cope with the large increase in computing and storage resource usage anticipated in the coming Large Hadron Collider (LHC) data-taking runs. It is designed to intelligently orchestrate workflow and data management systems, decoupling data pre-processing, delivery, and main processing across a variety of workflows. iDDS is an experiment-agnostic service built around a workflow-oriented structure with Directed Acyclic Graph (DAG) support, serving both existing and emerging use cases in ATLAS and other experiments. Here we present the motivation for iDDS, its design schema and architecture, its use cases and current status in ATLAS and the Rubin Observatory exercise, and plans for the future.