Conveners
Track 6 – Physics Analysis: Analysis Tools & Methods
- Ross Young (University of Adelaide)
Track 6 – Physics Analysis: New Physics studies
- Maurizio Pierini (CERN)
Track 6 – Physics Analysis: Framework
- Maurizio Pierini (CERN)
Track 6 – Physics Analysis: Lattice QCD
- Phiala Shanahan (Massachusetts Institute of Technology)
Track 6 – Physics Analysis: Machine Learning for data analysis
- Maurizio Pierini (CERN)
Track 6 – Physics Analysis: Pheno fits / Analysis preservation
- Martin White (University of Adelaide)
Track 6 – Physics Analysis: Event Reconstruction and Selection methods
- Maurizio Pierini (CERN)
Statistical modelling is a key element for High-Energy Physics (HEP) analysis. Currently, most of this modelling is performed with the ROOT/RooFit toolkit which is written in C++ and provides Python bindings which are only loosely integrated into the scientific Python ecosystem. We present zfit, a new alternative to RooFit, written in pure Python. Built on top of TensorFlow (a modern, high...
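As a flavour of the user-facing side of such a package, the following is a minimal sketch of an unbinned Gaussian fit in zfit; the exact API (zfit.Space, zfit.pdf.Gauss, the Minuit minimizer wrapper) is assumed from recent zfit releases and may differ between versions.

```python
import numpy as np
import zfit

# observable space and free parameters (names and ranges are illustrative)
obs = zfit.Space("x", limits=(-5, 5))
mu = zfit.Parameter("mu", 0.0, -1.0, 1.0)
sigma = zfit.Parameter("sigma", 1.0, 0.1, 5.0)
gauss = zfit.pdf.Gauss(mu=mu, sigma=sigma, obs=obs)

# toy data wrapped in a zfit Data object
data = zfit.Data.from_numpy(obs=obs, array=np.random.normal(0.2, 1.1, size=10_000))

# build the negative log-likelihood and minimise it with the Minuit wrapper
nll = zfit.loss.UnbinnedNLL(model=gauss, data=data)
result = zfit.minimize.Minuit().minimize(nll)
print(result.params)
```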
ROOT provides, through TMVA, machine learning tools for data analysis at HEP experiments and beyond. In this talk, we present recently included features in TMVA and the strategy for future developments in the diversified machine learning landscape. Focus is put on fast machine learning inference, which enables analysts to deploy their machine learning models rapidly on large scale datasets....
SWAN (Service for Web-based ANalysis) is a CERN service that allows users to perform interactive data analysis in the cloud, in a "software as a service" model. The service is a result of the collaboration between IT Storage and Databases groups and EP-SFT group at CERN. SWAN is built upon the widely-used Jupyter notebooks, allowing users to write - and run - their data analysis using only a...
The Faster Analysis Software Taskforce (FAST) is a small European group of HEP researchers who have been investigating and developing modern software approaches to improve HEP analyses. We present here an overview of the key product of this effort: a set of packages that allows a complete implementation of an analysis using almost exclusively YAML files. Serving as an analysis description...
In physics we often encounter high-dimensional data, in the form of multivariate measurements or of models with multiple free parameters. The information encoded is increasingly explored using machine learning, but is not typically explored visually. The barrier tends to be visualising beyond 3D, but systematic approaches for this exist in the statistics literature. I will use examples from...
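One of the standard statistics-literature techniques for viewing more than three dimensions is the parallel-coordinates plot, in which each observation is drawn as a line across all axes. A minimal sketch with pandas (the toy data and column names are purely illustrative, and this is not necessarily the technique featured in the talk):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(0)
# toy 5-dimensional dataset with two classes
df = pd.DataFrame(rng.normal(size=(200, 5)), columns=[f"x{i}" for i in range(5)])
df["label"] = np.where(df["x0"] + df["x1"] > 0, "signal", "background")

# each event becomes a poly-line across the five axes
parallel_coordinates(df, "label", alpha=0.3)
plt.show()
```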
RooFit is the statistical modeling and fitting package used in many experiments to extract physical parameters from reduced particle collision data. RooFit aims to separate particle physics model building and fitting (the users' goals) from their technical implementation and optimization in the back-end. In this talk, we outline our efforts to further optimize the back-end by automatically...
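For context, the user-facing model building and fitting that RooFit keeps separate from the back-end looks roughly like the following PyROOT sketch (variable names and ranges are illustrative); the optimisations discussed in the talk happen behind calls such as fitTo, without changes to user code.

```python
import ROOT

# observable and model parameters
x = ROOT.RooRealVar("x", "observable", -10, 10)
mean = ROOT.RooRealVar("mean", "mean", 0, -5, 5)
sigma = ROOT.RooRealVar("sigma", "width", 1, 0.1, 5)
gauss = ROOT.RooGaussian("gauss", "Gaussian PDF", x, mean, sigma)

# generate a toy dataset and fit it; the computation runs in the C++ back-end
data = gauss.generate(ROOT.RooArgSet(x), 10000)
gauss.fitTo(data)
```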
In this talk I will present an investigation into sizeable interference effects between a heavy charged Higgs boson signal produced via $gg\to t\bar b H^-$ (+ c.c.) followed by the decay $H^-\to b\bar t$ (+ c.c.) and the irreducible background given by $gg\to t\bar t b \bar b$ topologies at the Large Hadron Collider (LHC). I will show how such effects could spoil current $H^\pm$...
GAMBIT is a modular and flexible framework for performing global fits to a wide range of theories for new physics. It includes theory and analysis calculations for direct production of new particles at the LHC, flavour physics, dark matter experiments, cosmology and precision tests, as well as an extensive library of advanced parameter-sampling algorithms. I will present the GAMBIT software...
Despite the overwhelming cosmological evidence for the existence of dark matter, and the considerable effort of the scientific community over decades, there is no evidence for dark matter in terrestrial experiments.
The GPS.DM observatory uses the existing GPS constellation as a 50,000 km-aperture sensor array, analysing the satellite and terrestrial atomic clock data for exotic physics...
In this talk, we discuss the new physics implications of the Two Higgs Doublet Model (2HDM) under various experimental constraints. As part of the work of the GAMBIT group, we use the global fit method to constrain the parameter space, identify hints of new physics, and make predictions for further studies.
In our global fit, we include the constraints from LEP, LHC (SM-like...
At high-luminosity flavor factory experiments such as the Belle II experiment, new physics effects are expected to be found, and new physics models constrained, thanks to the high statistics and the many available observables. In such analyses, a global, model-independent treatment of the many observables is important. One difficulty in such a global analysis is that the new physics...
Searches for beyond-Standard Model physics at the LHC have thus far not uncovered any evidence of new particles, and this is often used to state that new particles with low mass are now excluded. Using the example of the supersymmetric partners of the electroweak sector of the Standard Model, I will present recent results from the GAMBIT collaboration that show that there is plenty of room for...
The increase in luminosity by a factor of 100 for the HL-LHC with respect to Run 1 poses a big challenge from the data analysis point of view. It demands a comparable improvement in software and processing infrastructure. The use of GPU-enhanced supercomputers will increase the available computing power, and analysis languages will have to be adapted to integrate them. The particle physics...
With an increased dataset obtained during CERN LHC Run-2, the even larger forthcoming Run-3 data and more than an order of magnitude expected increase for HL-LHC, the ATLAS experiment is reaching the limits of the current data production model in terms of disk storage resources. The anticipated availability of an improved fast simulation will enable ATLAS to produce significantly larger Monte...
For the last 5 years, Accelogic has pioneered and perfected a radically new theory of numerical computing codenamed “Compressive Computing”, which has an extremely profound impact on real-world computer science. At the core of this new theory is the discovery of one of its fundamental theorems, which states that, under very general conditions, the vast majority (typically between 70% and 80%) of the...
The ALICE experiment is currently undergoing a major upgrade program, both in terms of hardware and software, to prepare for LHC Run 3. A new software framework is being developed in collaboration with the FAIR experiments at GSI to cope with the 100-fold increase in collected collisions. We present our progress in adapting this framework for end-user physics data analysis. In...
Scikit-HEP is a community-driven and community-oriented project with the goal of providing an ecosystem for particle physics data analysis in Python. Scikit-HEP is a toolset of approximately twenty packages and a few “affiliated” packages. It expands the typical Python data analysis tools for particle physicists. Each package focuses on a particular topic, and interacts with other packages in...
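A hedged sketch of how a couple of these packages compose in practice, assuming an uproot 4-style API and a hypothetical file events.root containing a TTree named Events with branches pt and eta:

```python
import uproot
import awkward as ak

# open the tree and read two branches as awkward arrays (columnar, no event loop)
tree = uproot.open("events.root")["Events"]
arrays = tree.arrays(["pt", "eta"], library="ak")

# apply a columnar selection and count the surviving entries
selected = arrays[arrays.pt > 30]
print(ak.num(selected.pt, axis=0))
```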
The COFFEA Framework provides a new approach to HEP analysis, via columnar operations, that improves time-to-insight, scalability, portability, and reproducibility of analysis. It is implemented with the Python programming language and commodity big data technologies such as Apache Spark and NoSQL databases. To achieve this suite of improvements across many use cases, COFFEA takes a factorized...
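A minimal sketch of the factorised, columnar style that COFFEA encourages, assuming a NanoAOD-like events object with a Muon collection; the class name and accumulator conventions here are illustrative and version-dependent, and running it would additionally require one of COFFEA's executors.

```python
import awkward as ak
from coffea import processor

class DimuonCounter(processor.ProcessorABC):
    """Count events with at least two muons, expressed as columnar operations."""

    def process(self, events):
        muons = events.Muon                      # per-event muon collection
        mask = ak.num(muons) >= 2                # boolean mask, one entry per event
        return {"n_selected": int(ak.sum(mask))}

    def postprocess(self, accumulator):
        return accumulator
```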
GUM is a new feature of the GAMBIT global fitting software framework, which provides a direct interface between Lagrangian level tools and GAMBIT. GUM automatically writes GAMBIT routines to compute observables and likelihoods for physics beyond the Standard Model. I will describe the structure of GUM, the tools (within GAMBIT) it is able to create interfaces to, and the observables it is able...
I will discuss recent advances in lattice QCD, from the physics and computational points of view, that have enabled a number of basic properties and interactions of light nuclei to be determined directly from QCD. These calculations offer the prospect of providing nuclear matrix inputs necessary for a range of intensity frontier experiments (DUNE, mu2e) and dark matter direct-detection experiments...
Background field methods offer an approach through which fundamental non-perturbative hadronic properties can be studied. Lattice QCD is the only ab initio method with which Quantum Chromodynamics can be studied at low energies; it involves numerically calculating expectation values in the path integral formalism. This requires substantial investment in high-performance supercomputing...
There exists a long standing discrepancy of around 3.5 sigma between experimental measurements and standard model calculations of the magnetic moment of the muon. Current experiments aim to reduce the experimental uncertainty by a factor of 4, and Standard Model calculations must also be improved by a similar order. The largest uncertainty in the Standard Model calculation comes from the QCD...
Computing the gluon component of momentum in the nucleon is a difficult and computationally expensive problem, as the matrix element involves a quark-line-disconnected gluon operator which suffers from ultra-violet fluctuations. But also necessary for a successful determination is the non-perturbative renormalisation of this operator. We investigate this renormalisation here by direct...
The origin of the low-lying nature of the Roper resonance has been the subject of significant interest for many years, including several investigations using lattice QCD. It has been claimed that chiral symmetry plays an important role in our understanding of this resonance. We present results from our systematic examination of the potential role of chiral symmetry in the low-lying nucleon...
Recent searches for supersymmetric particles at the Large Hadron Collider have been unsuccessful in detecting any BSM physics. This is partially because the exact masses of supersymmetric particles are not known, and as such, searching for them is very difficult. The method broadly used in searching for new physics requires one to optimise on the signal being searched for, potentially...
Charged particle tracking represents the largest consumer of CPU resources in high data volume Nuclear Physics experiments. An effort is underway to develop ML networks that will reduce the resources required for charged particle tracking. Tracking in NP experiments presents some unique challenges compared to HEP. In particular, track finding typically represents only a small fraction of the...
For physics analyses with identical final state objects, e.g. jets, the correct sorting of the objects at the input of the analysis can lead to a considerable performance increase.
We present a new approach in which a sorting network is placed upstream of a classification network. The sorting network combines the whole event information and explicitly pre-sorts the inputs of the analysis....
We present preliminary studies of a deep neural network (DNN) "tagger" that is trained to identify the presence of displaced jets arising from the decays of new long-lived particle (LLP) states in data recorded by the CMS detector at the CERN LHC. Particle-level candidates, as well as secondary vertex information, are refined through the use of convolutional neural networks (CNNs) before being...
Deep neural networks (DNNs) have been applied to the fields of computer vision and natural language processing with great success in recent years. The success of these applications has hinged on the development of specialized DNN architectures that take advantage of specific characteristics of the problem to be solved, namely convolutional neural networks for computer vision and recurrent...
We describe a multi-disciplinary project to use machine learning techniques based on neural networks (NNs) to construct a Monte Carlo event generator for lepton-hadron collisions that is agnostic of theoretical assumptions about the microscopic nature of particle reactions. The generator, referred to as ETHER (Empirically Trained Hadronic Event Regenerator), is trained to experimental data...
An important part of the LHC legacy will be precise limits on indirect effects of new physics, framed for instance in terms of an effective field theory. These measurements often involve many theory parameters and observables, which makes them challenging for traditional analysis methods. We discuss the underlying problem of “likelihood-free” inference and present powerful new analysis...
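The core of the "likelihood-free" problem can be stated in one line: for a simulator with latent variables $z$ (parton-level momenta, shower and detector histories), the likelihood of an observed event $x$ given theory parameters $\theta$ is $p(x \mid \theta) = \int \mathrm{d}z \, p(x, z \mid \theta)$, an intractable integral even though the simulator can readily sample from $p(x, z \mid \theta)$. Methods of this kind typically work around the intractability by learning the $\theta$-dependence from simulated samples.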
Extracting information about the quark and gluon (or parton) structure of the nucleon from high-energy scattering data is a classic example of the inverse problem: the experimental cross sections are given by convolutions of the parton probability distributions with process-dependent hard coefficients that are perturbatively calculable from QCD. While most analyses in the past have been based...
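Schematically, the convolution referred to here takes the form (shown for a deep-inelastic structure function, with factorisation and renormalisation scales suppressed) $F_2(x,Q^2) = \sum_i \int_x^1 \frac{d\xi}{\xi}\, C_i(x/\xi, \alpha_s(Q^2))\, f_i(\xi, Q^2)$, where the $f_i$ are the parton distribution functions and the $C_i$ are the perturbatively calculable hard coefficients; inferring the $f_i$ from the measured cross sections is the inverse problem described above.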
Phase transitions played an important role in the very early evolution of the Universe. We present a C++ software package (PhaseTracer) for finding cosmological phases and calculating transition properties involving single or multiple scalar fields. The package first maps the phase structure by tracing the vacuum expectation value (VEV) of the potential at different temperatures, then finds...
The CERN analysis preservation portal (CAP) comprises a set of tools and services aiming to assist researchers in describing and preserving all the components of a physics analysis such as data, software and computing environment. Together with the associated documentation, all these assets are kept in one place so that the analysis can be fully or partially reused even several years after the...
Likelihoods associated with statistical fits in searches for new physics are beginning to be published by LHC experiments on HEPData [arXiv:1704.05473]. The first of these is the search for bottom-squark pair production by ATLAS [ATLAS-CONF-2019-011]. These likelihoods adhere to a specification first defined by the...
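As an illustration of how such published probability models can be consumed downstream (the choice of pyhf here, and the one-bin numbers, are assumptions for the sketch; the API shown corresponds to pyhf ≥ 0.6):

```python
import pyhf

# a toy one-bin model: signal plus background with an uncorrelated background uncertainty
model = pyhf.simplemodels.uncorrelated_background(
    signal=[5.0], bkg=[50.0], bkg_uncertainty=[7.0]
)
data = [55.0] + model.config.auxdata

# observed CLs for a signal-strength hypothesis mu = 1
cls_obs = pyhf.infer.hypotest(1.0, data, model, test_stat="qtilde")
print(float(cls_obs))
```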
In this paper we present the CERN Analysis Preservation service as a FAIR (Findable, Accessible, Interoperable and Reusable) research data preservation repository platform for LHC experiments. The CERN Analysis Preservation repository allows LHC collaborations to deposit and share the structured information about analyses as well as to capture the individual data assets associated to the...
We present a major overhaul to lepton identification for the Belle II experiment, based on a novel multi-variate classification algorithm.
A key topic in the Belle II physics programme is the study of semi-tauonic B decays to test lepton flavour universality violation, such as $B\rightarrow D^{*}\tau\nu$. The analysis of this decay relies on the capability of correctly separating low...
The Recursive Jigsaw Reconstruction method is a technique to analyze reconstructed particles in the presence of kinematic and combinatoric unknowns which are associated with unmeasured or indistinguishable particles. By factorizing the unknowns according to an assumed topology and applying fixed algorithmic choices - Jigsaw Rules, we are able to approximately reconstruct rest frames throughout...
We propose a new search strategy, based on deep-learning (DL) anomaly detection, to search for new physics in all-jet final states without specific assumptions. The DL model identifies events with an anomalous radiation pattern in the jets. This is done by applying a threshold to the reconstruction loss. The threshold is tuned so that the rejected events provide an estimate of the QCD-background...
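A minimal numpy sketch of the thresholding step just described, under the assumption of an autoencoder-style model whose per-event reconstruction is available (the 99% quantile working point is illustrative):

```python
import numpy as np

def select_anomalies(inputs, reconstructions, quantile=0.99):
    """Split events by reconstruction loss: high-loss events are anomaly candidates,
    low-loss (rejected) events can serve as a background-enriched control sample."""
    # per-event mean-squared-error reconstruction loss
    loss = np.mean((inputs - reconstructions) ** 2, axis=1)
    threshold = np.quantile(loss, quantile)
    accepted = loss > threshold
    rejected = ~accepted
    return accepted, rejected, threshold
```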
HEP event selection is traditionally considered a binary classification problem, involving the dichotomous categories of signal and background. In distribution fits for particle masses or couplings, however, signal events are not all equivalent, as the signal differential cross section has different sensitivities to the measured parameter in different regions of phase space. In this talk, I...
The main purpose of modern heavy-ion experiments is a comprehensive study of the QCD phase diagram in the region of the quark-gluon plasma (QGP) and of the possible phase transition to the QGP phase.
One of the possible signals of QGP formation is an increase in the production of strange particles. Reconstruction of $\Sigma$ particles together with other strange particles completes the picture...
MATHUSLA has been proposed as a detector that sits over 100 m from an LHC interaction point, on the surface, to look for ultra long-lived particles. A test stand was constructed with two layers of scintillator paddles and six layers of RPCs, on loan from the DZERO and the Argo-YBJ experiments. Downward and upward going tracks from cosmic ray data and muons from the interaction point have been...