Nov 4 – 8, 2019
Adelaide Convention Centre
Australia/Adelaide timezone

Running synchronous detector reconstruction in ALICE using declarative workflows

Nov 5, 2019, 4:30 PM
Riverbank R4 (Adelaide Convention Centre)


Oral, Track X – Crossover sessions from online, offline and exascale


Matthias Richter (University of Oslo (NO))


The ALICE experiment at the Large Hadron Collider (LHC) at CERN will deploy a combined online-offline facility for detector readout and reconstruction, as well as data compression. This system is designed to allow the inspection of all collisions at rates of 50 kHz for Pb-Pb and 400 kHz for pp collisions, in order to give access to rare physics signals. The input data rate of up to 3.4 TByte/s requires that a large part of the detector reconstruction be carried out online, in the synchronous stage of the system.

The data processing is based on individual algorithms which are executed in parallel processes on multiple compute nodes. Data and workload are distributed among the nodes and processes using message-queue communication provided by the FairMQ package of the ALFA software framework. On top of this, as the ALICE-specific layer, a message-passing-aware data model and annotation scheme allow data and its routing to be described efficiently. Finally, the Data Processing Layer introduces a data-flow oriented description of the reconstruction, and hides the complicated nature of the distributed system from users and developers. So-called workflows are defined in a declarative language as sequences of processes, each described by three properties: inputs, the algorithm, and outputs.
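The declarative style can be illustrated with a simplified, self-contained C++ sketch. The type names, processor names, and data identifiers below are hypothetical placeholders chosen for illustration, not the actual O2 API:

```cpp
#include <functional>
#include <string>
#include <vector>

// Simplified mirror of a declarative workflow description: each processor
// is fully described by its inputs, its algorithm, and its outputs.
struct DataSpec {
  std::string binding;  // e.g. "TPC/CLUSTERS" (illustrative identifier)
};

struct DataProcessorSpec {
  std::string name;
  std::vector<DataSpec> inputs;
  std::vector<DataSpec> outputs;
  std::function<void()> algorithm;  // placeholder for the processing callback
};

using WorkflowSpec = std::vector<DataProcessorSpec>;

// Declarative description of a two-stage reconstruction chain:
// digits -> clusters -> tracks. The framework, not the developer,
// is responsible for wiring the stages together.
WorkflowSpec defineWorkflow() {
  return {
      {"tpc-clusterizer",
       {{"TPC/DIGITS"}},
       {{"TPC/CLUSTERS"}},
       [] { /* clusterization algorithm goes here */ }},
      {"tpc-tracker",
       {{"TPC/CLUSTERS"}},
       {{"TPC/TRACKS"}},
       [] { /* tracking algorithm goes here */ }},
  };
}
```

The developer only states *what* each process consumes and produces; the topology of the distributed system is never written by hand.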

With this layered structure of the ALICE software, specific stages of the reconstruction can be developed flexibly within the domain of the individual processes, without boilerplate adjustments and without having to account for the details of the distributed, parallel system. The Data Processing Layer framework takes care of generating the workflow with the required connections and synchronization, and interfaces with the backend that deploys the workflow on computing resources. For the developer it is completely transparent whether a workflow runs on a laptop or on a computer cluster.
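The "required connections" mentioned above can be derived mechanically from the declarations: a channel is needed wherever one process's output matches another's input. The following self-contained sketch shows the idea; the struct and function names are invented for illustration and do not reflect the actual framework internals:

```cpp
#include <string>
#include <vector>

// Minimal stand-in for a declared processor: name, consumed and produced data.
struct ProcSpec {
  std::string name;
  std::vector<std::string> inputs;
  std::vector<std::string> outputs;
};

// A point-to-point channel the framework would have to create.
struct Channel {
  std::string from, to, data;
};

// Derive the channel topology by matching declared outputs against declared
// inputs, so the developer never writes the message-queue wiring by hand.
std::vector<Channel> deriveChannels(const std::vector<ProcSpec>& workflow) {
  std::vector<Channel> channels;
  for (const auto& producer : workflow)
    for (const auto& out : producer.outputs)
      for (const auto& consumer : workflow)
        for (const auto& in : consumer.inputs)
          if (in == out)
            channels.push_back({producer.name, consumer.name, out});
  return channels;
}
```

Because the topology is computed from the same declarations regardless of where the processes land, deploying on a laptop or on a cluster only changes the backend, not the workflow description.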

This modular software framework is the basis for splitting the data processing into manageable pieces and helps to distribute the development effort. This contribution describes the implementation of reconstruction workflows in the ALICE software framework and the optimization of data objects, and presents results from the realized prototypes.


Primary author

Matthias Richter (University of Oslo (NO))

