19–25 Oct 2024
Europe/Zurich timezone

Leveraging workflow engines and computing frameworks for physics analysis scalability and reproducibility

22 Oct 2024, 14:42
18m
Room 2.B (Conference Room)


Talk Track 8 - Collaboration, Reinterpretation, Outreach and Education Parallel (Track 8)

Speaker

Dr Mindaugas Sarpis (Vilnius University)

Description

With ever more data collected by the experiments at the LHC and the increasing complexity of the analysis workflows themselves, there is a need to ensure the scalability of physics data analyses. The logical parts of an analysis should be well separated, i.e. the analysis should be modularized. Where possible, these parts should be maintained and reused across other analyses or for reinterpretation of the same analysis.
Preparing an analysis in this way also helps to ensure its reproducibility and preservation, in line with good data and analysis code management practices following the FAIR principles. This talk discusses several aspects of analysis modularization, using a search for pentaquarks within the LHCb experiment at CERN as an example.
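The modularization described above can be sketched as a set of self-contained stages that communicate only through files on disk, so any stage can be swapped out (e.g. a different selection for a reinterpretation) without touching the rest. This is a minimal illustrative sketch, not the analysis presented in the talk; the stage names, the mass cut, and the values are hypothetical.

```python
import json
import os
import tempfile

# Each stage takes explicit inputs and writes an explicit output file,
# mimicking how a workflow engine would wire modular analysis steps.

def select(events, out_path, mass_cut=4300):
    """Selection stage: keep candidates above a (hypothetical) mass cut in MeV."""
    selected = [e for e in events if e["mass"] > mass_cut]
    with open(out_path, "w") as f:
        json.dump(selected, f)
    return out_path

def fit(in_path, out_path):
    """Fit stage: here just a mean mass, standing in for a real fit."""
    with open(in_path) as f:
        selected = json.load(f)
    mean = sum(e["mass"] for e in selected) / len(selected)
    with open(out_path, "w") as f:
        json.dump({"mean_mass": mean, "n": len(selected)}, f)
    return out_path

if __name__ == "__main__":
    workdir = tempfile.mkdtemp()
    events = [{"mass": m} for m in (4200, 4380, 4450)]
    sel = select(events, os.path.join(workdir, "selected.json"))
    res = fit(sel, os.path.join(workdir, "result.json"))
    with open(res) as f:
        print(json.load(f))  # {'mean_mass': 4415.0, 'n': 2}
```

Because each stage's contract is just "files in, files out", the same stages can be declared as rules in a workflow engine (e.g. Snakemake or a REANA specification), which then handles ordering, caching, and scaling.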

Primary author

Dr Mindaugas Sarpis (Vilnius University)

Presentation materials