We provide an overview of two ongoing projects that aim to ensure the availability of fast and user-friendly solutions for physics analysis pipelines towards the HL-LHC. The Analysis Grand Challenge (AGC) defines an analysis task that captures relevant physics analysis workflow aspects. A variety of implementations have been developed for this task, making it possible to probe user experience and...
This talk will give a broad overview of fitting in HEP. On the one
hand, it will cover the variety of fits performed in HEP, the
different needs and types of inference, as well as efforts towards
serialization and standardization. On the other hand, it will cover
the relevant libraries, that is, zfit, pyhf, hepstats, iminuit, and
general Python packages such as SciPy, and how they work...
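To make the kind of fit these libraries streamline concrete, here is a minimal unbinned maximum-likelihood fit written with plain SciPy. It is a generic sketch, not the API of zfit, pyhf, or iminuit; the dataset and parameter values are toy assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=1.2, scale=0.5, size=1000)  # toy dataset

def nll(params):
    """Gaussian negative log-likelihood (additive constants dropped)."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    return np.sum(np.log(sigma) + 0.5 * ((data - mu) / sigma) ** 2)

# Minimize the NLL to obtain the maximum-likelihood estimates
result = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(mu_hat, sigma_hat)
```

Dedicated HEP fitting libraries add what this sketch lacks: model composition, parameter limits and constraints, uncertainty estimation, and serialization of the fit model.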
I'd like to present evermore
(https://github.com/pfackeldey/evermore), a package that focuses on efficiently building and evaluating likelihoods, typically for HEP. Currently, it focuses on binned template fits.
It supports autodiff, JIT compilation, and vectorization of full fits (even on GPUs).
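The core object in a binned template fit is the Poisson likelihood over histogram bins. The following numpy sketch shows that likelihood for a single signal-strength parameter; the template and observed yields are made-up toy numbers, and a grid scan stands in for the JAX-based gradient minimization that evermore would actually use. It is not evermore's API.

```python
import numpy as np

# Toy binned templates (hypothetical numbers, for illustration only)
signal = np.array([5.0, 20.0, 10.0])       # expected signal yields per bin
background = np.array([50.0, 40.0, 30.0])  # expected background yields per bin
observed = np.array([56, 62, 41])          # observed counts per bin

def nll(mu):
    """Poisson negative log-likelihood for signal strength mu (constants dropped)."""
    expected = mu * signal + background
    return np.sum(expected - observed * np.log(expected))

# Simple grid scan over mu; a real fit would use autodiff + a gradient-based minimizer
mus = np.linspace(0.0, 3.0, 3001)
mu_hat = mus[np.argmin([nll(m) for m in mus])]
print(mu_hat)
```

Because the likelihood is a pure array expression, frameworks built on JAX can differentiate, JIT-compile, and vectorize it over many templates or toys at once.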
Workflow managers help structure the code of pipelined jobs by defining and managing dependencies between tasks in a clear and easy-to-understand fashion. This abstraction allows independent tasks to be parallelised automatically, largely independently of the underlying computing system. Additionally, workflow managers help keep track of different tasks’ outputs and inputs.
b2luigi is an extension of the...
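The dependency mechanism at the heart of such workflow managers can be sketched in a few lines of plain Python: declare each task's dependencies, then derive a valid execution order from the graph. This is a hypothetical toy (the task names are invented), not luigi or b2luigi code, which express the same idea with `Task` classes, `requires()`, and `output()` targets.

```python
# Each task lists the tasks it depends on
tasks = {
    "skim": [],
    "calibrate": ["skim"],
    "histogram": ["calibrate"],
    "plot": ["histogram"],
}

def run_order(tasks):
    """Return a topological execution order for the dependency graph."""
    order, done = [], set()

    def visit(name):
        if name in done:
            return
        for dep in tasks[name]:
            visit(dep)  # run dependencies first
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order

print(run_order(tasks))
```

A real workflow manager adds, on top of this ordering, completeness checks (skip tasks whose outputs already exist), batch-system submission, and parallel execution of independent branches.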
Physicists performing data analyses are usually required to steer their individual, complex workflows manually, frequently involving job submission in several stages and interaction with distributed storage systems by hand. This process is not only time-consuming and error-prone, but also leads to undocumented relations between particular workloads, rendering the steering of an analysis a...
Workflows for research in HEP experiments are not only quite complex but also require sufficient flexibility to adapt to changes in structure, conditions, methodologies, and research interests. This holds especially true for the physics analyses that extract the final results and measurements.
Here, workflow systems, specifically Luigi, have proven to be of great use for managing and...
Offloading resource-intensive tasks, e.g.:
- histogram accumulation - memory-intensive
- deep learning (DL) algorithms - compute-intensive
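The memory-intensive accumulation pattern can be sketched with plain numpy: process events in chunks and keep only the fixed-size bin counts in memory, which is exactly the state that gets offloaded or merged across workers. The chunk sizes and binning below are toy assumptions, not any framework's actual machinery.

```python
import numpy as np

rng = np.random.default_rng(42)
edges = np.linspace(-4.0, 4.0, 41)  # 40 bins
counts = np.zeros(len(edges) - 1)

# Accumulate chunk by chunk instead of loading all events at once;
# memory usage stays bounded by the chunk size plus the bin counts.
for _ in range(10):
    chunk = rng.normal(size=100_000)  # stand-in for one batch of events
    hist, _ = np.histogram(chunk, bins=edges)
    counts += hist

print(int(counts.sum()))
```

Because partial histograms add bin-wise, many such accumulators can run in parallel and be merged with a single sum at the end.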
Let’s discuss the exciting world of combining Python and Julia in data analysis for high-energy physics (HEP) and other data-intensive fields.
We'll kick things off with a quick overview of why Python is so popular for data analysis and introduce Julia, which is making waves with its incredible performance and suitability for scientific computing.
Next, I'll show you how we can get the...
PocketCoffea is a Python columnar analysis framework based on coffea for CMS NanoAOD events. It provides a workflow for HEP analyses built from a combination of customizable abstractions and configuration files. The package features automated dataset queries, jet calibration, data processing, histogramming, and plotting. PocketCoffea also provides support for code execution on various remote...