An intelligent Data Delivery Service (iDDS) for and beyond the ATLAS experiment

Jul 12, 2021, 4:30 PM
Track E (Zoom)


Talk: Computation, Machine Learning, and AI


Wen Guan (University of Wisconsin (US))


The intelligent Data Delivery Service (iDDS) has been developed to cope with the huge increase in computing and storage resource usage expected in the coming Large Hadron Collider (LHC) data-taking runs. It is designed to intelligently orchestrate workflow and data management systems, decoupling data pre-processing, delivery, and main processing in a variety of workflows. It is an experiment-agnostic service built around a workflow-oriented structure with Directed Acyclic Graph (DAG) support, serving existing and emerging use cases in ATLAS and other experiments. Here we will present the motivation for iDDS, its design schema and architecture, its use cases and current status in ATLAS and the Rubin Observatory exercise, and plans for the future.

Are you a member of the APS Division of Particles and Fields? No

Primary author

Wen Guan (University of Wisconsin (US))