Description
The Fair Universe project is building a large-compute-scale AI ecosystem for sharing datasets, training large models, and hosting challenges and benchmarks. The project is also exploiting this ecosystem for an AI challenge series focused on minimizing the effects of systematic uncertainties in High-Energy Physics (HEP) and on predicting accurate confidence intervals. This talk will describe the challenge platform we have developed, which builds on the open-source benchmark ecosystem Codabench and interfaces it with the NERSC HPC center and its Perlmutter system with over 7000 A100 GPUs.
This presentation will also tease the first of our Fair Universe public challenges hosted on this platform, the Fair Universe: HiggsML Uncertainty Challenge, which will apply to be a NeurIPS 2024 competition. Participants will be presented with a large training dataset corresponding to an H to tau tau cross-section measurement at the Large Hadron Collider. They should design an analysis technique able not just to measure the signal strength but also to provide a confidence interval, whose coverage will be evaluated automatically from pseudo-experiments. The confidence interval should account for the statistical uncertainty as well as systematic uncertainties (detector calibration, background levels, etc.). It is expected that advanced analysis techniques able to control the impact of systematics will perform best.
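As a rough illustration only, and not the competition's actual scoring code, the Python sketch below shows how the coverage of an interval-producing method can be checked with pseudo-experiments: many toy datasets are generated at a known true signal strength, an interval is computed for each, and the fraction of intervals containing the truth is compared to the nominal confidence level. The toy counting model and the functions generate_pseudo_experiment and compute_interval are hypothetical placeholders, not part of the challenge.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical toy counting experiment: expected signal S and background B,
# so the expected event count at signal strength mu is mu * S + B.
S_EXPECTED, B_EXPECTED = 100.0, 900.0

def generate_pseudo_experiment(mu_true):
    """Hypothetical placeholder: draw one pseudo-dataset (a Poisson count)
    at the known true signal strength mu_true."""
    return rng.poisson(mu_true * S_EXPECTED + B_EXPECTED)

def compute_interval(observed, confidence=0.68):
    """Hypothetical stand-in for a participant's method: return a (lo, hi)
    interval for mu using a simple Gaussian approximation (statistical only)."""
    mu_hat = (observed - B_EXPECTED) / S_EXPECTED
    sigma = np.sqrt(observed) / S_EXPECTED
    z = norm.ppf(0.5 + confidence / 2.0)
    return mu_hat - z * sigma, mu_hat + z * sigma

def estimated_coverage(mu_true=1.0, n_pseudo=2000, confidence=0.68):
    """Fraction of pseudo-experiments whose interval contains the true mu;
    correct coverage means this fraction matches the nominal confidence level."""
    hits = 0
    for _ in range(n_pseudo):
        lo, hi = compute_interval(generate_pseudo_experiment(mu_true), confidence)
        hits += (lo <= mu_true <= hi)
    return hits / n_pseudo

if __name__ == "__main__":
    print(f"Estimated coverage: {estimated_coverage():.3f} (nominal 0.68)")
```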
A hackathon held in November 2023 during the AI and the Uncertainty Challenge in Fundamental Physics Workshop in Paris (see presentation and conclusion) enabled us to validate the platform and the robustness of the ranking with a simplified prototype of the competition.
The Codabench/NERSC platform can also host challenges from other communities, and we intend to make our benchmark designs available as templates so that similar efforts can easily be launched in other domains.
Would you like to be considered for an oral presentation? Yes