Conveners
Track 6 – Physics Analysis: Analysis Tools & Methods
- Ross Young (University of Adelaide)
Track 6 – Physics Analysis: New Physics studies
- Maurizio Pierini (CERN)
Track 6 – Physics Analysis: Framework
- Maurizio Pierini (CERN)
Track 6 – Physics Analysis: Lattice QCD
- Phiala Shanahan (Massachusetts Institute of Technology)
Track 6 – Physics Analysis: Machine Learning for data analysis
- Maurizio Pierini (CERN)
Track 6 – Physics Analysis: Pheno fits / Analysis preservation
- Martin White (University of Adelaide (AU))
Track 6 – Physics Analysis: Event Reconstruction and Selection methods
- Maurizio Pierini (CERN)
Jonas Eschle (Universitaet Zuerich (CH)) | 04/11/2019, 11:00 | Track 6 – Physics Analysis | Oral
Statistical modelling is a key element for High-Energy Physics (HEP) analysis. Currently, most of this modelling is performed with the ROOT/RooFit toolkit which is written in C++ and provides Python bindings which are only loosely integrated into the scientific Python ecosystem. We present zfit, a new alternative to RooFit, written in pure Python. Built on top of TensorFlow (a modern, high...
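The statistical-modelling task zfit addresses can be illustrated without the library itself. The following is a minimal NumPy/SciPy sketch of an unbinned negative log-likelihood fit of a Gaussian model, of the kind zfit builds on TensorFlow; the dataset, parameter values, and starting point are invented for illustration and this is not the zfit API:

```python
import numpy as np
from scipy.optimize import minimize

# Toy "measured" dataset drawn from a Gaussian with invented true parameters
rng = np.random.default_rng(42)
data = rng.normal(loc=0.5, scale=1.2, size=5000)

def nll(params):
    """Unbinned negative log-likelihood of a Gaussian model (constants dropped)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parametrise via log to keep sigma positive
    return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * np.log(sigma)

# Gradient-based minimisation, standing in for the fitter's minimiser
result = minimize(nll, x0=[0.0, 0.0])
mu_hat, sigma_hat = result.x[0], float(np.exp(result.x[1]))
```

The fitted `mu_hat` and `sigma_hat` recover the generating parameters to within statistical uncertainty.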
Stefan Wunsch (KIT - Karlsruhe Institute of Technology (DE)) | 04/11/2019, 11:15 | Track 6 – Physics Analysis | Oral
ROOT provides, through TMVA, machine learning tools for data analysis at HEP experiments and beyond. In this talk, we present recently included features in TMVA and the strategy for future developments in the diversified machine learning landscape. Focus is put on fast machine learning inference, which enables analysts to deploy their machine learning models rapidly on large scale datasets....
Jakub Moscicki (CERN) | 04/11/2019, 11:30 | Track 6 – Physics Analysis | Oral
SWAN (Service for Web-based ANalysis) is a CERN service that allows users to perform interactive data analysis in the cloud, in a "software as a service" model. The service is a result of the collaboration between IT Storage and Databases groups and EP-SFT group at CERN. SWAN is built upon the widely-used Jupyter notebooks, allowing users to write - and run - their data analysis using only a...
Benjamin Krikler (University of Bristol (GB)) | 04/11/2019, 11:45 | Track 6 – Physics Analysis | Oral
The Faster Analysis Software Taskforce (FAST) is a small European group of HEP researchers that have been investigating and developing modern software approaches to improve HEP analyses. We present here an overview of the key product of this effort: a set of packages that allows a complete implementation of an analysis using almost exclusively YAML files. Serving as an analysis description...
Ursula Laa (Monash University) | 04/11/2019, 12:00 | Track 6 – Physics Analysis | Oral
In physics we often encounter high-dimensional data, in the form of multivariate measurements or of models with multiple free parameters. The information encoded is increasingly explored using machine learning, but is not typically explored visually. The barrier tends to be visualising beyond 3D, but systematic approaches for this exist in the statistics literature. I will use examples from...
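The systematic approaches from the statistics literature mentioned here (such as tour methods) amount to viewing a sequence of low-dimensional projections of the data. A minimal sketch of one such "frame", assuming an invented toy 6-dimensional dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(size=(1000, 6))  # toy 6-dimensional dataset

# One "frame" of a tour: draw a random plane and orthonormalise it via QR,
# then project every point onto it, ready for a 2D scatter plot
basis, _ = np.linalg.qr(rng.normal(size=(6, 2)))
projected = points @ basis
```

Animating over a smooth sequence of such planes gives the tour display used to explore structure beyond 3D.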
Dr Carsten Daniel Burgard (Nikhef National institute for subatomic physics (NL)) | 04/11/2019, 12:15 | Track 6 – Physics Analysis | Oral
RooFit is the statistical modeling and fitting package used in many experiments to extract physical parameters from reduced particle collision data. RooFit aims to separate particle physics model building and fitting (the users' goals) from their technical implementation and optimization in the back-end. In this talk, we outline our efforts to further optimize the back-end by automatically...
Riley Patrick (The University of Adelaide) | 04/11/2019, 14:00 | Track 6 – Physics Analysis | Oral
In this talk I will present an investigation into sizeable interference effects between a {heavy} charged Higgs boson signal produced via $gg\to t\bar b H^-$ (+ c.c.) followed by the decay $H^-\to b\bar t$ (+ c.c.) and the irreducible background given by $gg\to t\bar t b \bar b$ topologies at the Large Hadron Collider (LHC). I will show how such effects could spoil current $H^\pm$...
Pat Scott (The University of Queensland) | 04/11/2019, 14:15 | Track 6 – Physics Analysis | Oral
GAMBIT is a modular and flexible framework for performing global fits to a wide range of theories for new physics. It includes theory and analysis calculations for direct production of new particles at the LHC, flavour physics, dark matter experiments, cosmology and precision tests, as well as an extensive library of advanced parameter-sampling algorithms. I will present the GAMBIT software...
Benjamin Roberts (The University of Queensland) | 04/11/2019, 14:30 | Track 6 – Physics Analysis | Oral
Despite the overwhelming cosmological evidence for the existence of dark matter, and the considerable effort of the scientific community over decades, there is no evidence for dark matter in terrestrial experiments. The GPS.DM observatory uses the existing GPS constellation as a 50,000 km-aperture sensor array, analysing the satellite and terrestrial atomic clock data for exotic physics...
Dr Wei Su (University of Adelaide) | 04/11/2019, 14:45 | Track 6 – Physics Analysis | Oral
In this talk, we discuss the new physics implications of the Two Higgs Doublet Model (2HDM) under various experimental constraints. As part of the GAMBIT group's work, we use the global fit method to constrain the parameter space, find hints of new physics, and make predictions for further studies.
In our global fit, we include the constraints from LEP, LHC (SM-like...
Koji Hara (KEK) | 04/11/2019, 15:00 | Track 6 – Physics Analysis | Oral
At high-luminosity flavour factory experiments such as the Belle II experiment, new physics effects are expected to be found, and new physics models constrained, through the high statistics and many observables. In such analyses, a model-independent global analysis of the many observables is important. One difficulty in such a global analysis is that the new physics...
Martin John White (University of Adelaide (AU)) | 04/11/2019, 15:15 | Track 6 – Physics Analysis | Oral
Searches for beyond-Standard Model physics at the LHC have thus far not uncovered any evidence of new particles, and this is often used to state that new particles with low mass are now excluded. Using the example of the supersymmetric partners of the electroweak sector of the Standard Model, I will present recent results from the GAMBIT collaboration that show that there is plenty of room for...
Gordon Watts (University of Washington (US)) | 05/11/2019, 11:00 | Track 6 – Physics Analysis | Oral
The increase in luminosity by a factor of 100 for the HL-LHC with respect to Run 1 poses a big challenge from the data analysis point of view. It demands a comparable improvement in software and processing infrastructure. The use of GPU enhanced supercomputers will increase the amount of computer power and analysis languages will have to be adapted to integrate them. The particle physics...
Johannes Elmsheuser (Brookhaven National Laboratory (US)) | 05/11/2019, 11:15 | Track 6 – Physics Analysis | Oral
With an increased dataset obtained during CERN LHC Run-2, the even larger forthcoming Run-3 data and more than an order of magnitude expected increase for HL-LHC, the ATLAS experiment is reaching the limits of the current data production model in terms of disk storage resources. The anticipated availability of an improved fast simulation will enable ATLAS to produce significantly larger Monte...
Gene Van Buren (Brookhaven National Laboratory) | 05/11/2019, 11:30 | Track 6 – Physics Analysis | Oral
For the last 5 years, Accelogic has pioneered and perfected a radically new theory of numerical computing, codenamed "Compressive Computing", which has an extremely profound impact on real-world computer science. At the core of this new theory is the discovery of one of its fundamental theorems, which states that, under very general conditions, the vast majority (typically between 70% and 80%) of the...
Giulio Eulisse (CERN) | 05/11/2019, 11:45 | Track 6 – Physics Analysis | Oral
The ALICE Experiment is currently undergoing a major upgrade program, both in terms of hardware and software, to prepare for the LHC Run 3. A new Software Framework is being developed in collaboration with the FAIR experiments at GSI to cope with the 100-fold increase in collected collisions. We present our progress in adapting this framework for end-user physics data analysis. In...
Eduardo Rodrigues (University of Cincinnati (US)) | 05/11/2019, 12:00 | Track 6 – Physics Analysis | Oral
Scikit-HEP is a community-driven and community-oriented project with the goal of providing an ecosystem for particle physics data analysis in Python. Scikit-HEP is a toolset of approximately twenty packages and a few “affiliated” packages. It expands the typical Python data analysis tools for particle physicists. Each package focuses on a particular topic, and interacts with other packages in...
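As an illustration of the kind of kinematics utility such packages provide, the standard invariant-mass formula for two massless particles can be written in a few lines. This is a hand-rolled sketch, not the Scikit-HEP API; the helper name and the muon momenta are invented:

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of two massless particles given as (pt, eta, phi) tuples:
    m^2 = 2 * pt1 * pt2 * (cosh(eta1 - eta2) - cos(phi1 - phi2))."""
    pt1, eta1, phi1 = p1
    pt2, eta2, phi2 = p2
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

# Two back-to-back 45.6 GeV muons reconstruct to a Z-like mass of ~91.2 GeV
mass = invariant_mass((45.6, 0.0, 0.0), (45.6, 0.0, math.pi))
```

Packages in the ecosystem wrap this kind of arithmetic in vectorised, array-aware objects rather than per-particle tuples.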
Nick Smith (Fermi National Accelerator Lab. (US)) | 05/11/2019, 12:15 | Track 6 – Physics Analysis | Oral
The COFFEA Framework provides a new approach to HEP analysis, via columnar operations, that improves time-to-insight, scalability, portability, and reproducibility of analysis. It is implemented with the Python programming language and commodity big data technologies such as Apache Spark and NoSQL databases. To achieve this suite of improvements across many use cases, COFFEA takes a factorized...
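Columnar operations of the kind COFFEA builds on can be sketched with plain NumPy; the columns and cut values below are invented toy data, not COFFEA code:

```python
import numpy as np

# Toy per-event columns: one leading-jet entry per event
jet_pt  = np.array([55.0, 18.0, 120.0, 42.0, 33.0])
jet_eta = np.array([0.4, 2.1, -1.3, 3.0, 0.1])

# Whole-column selection with no explicit event loop: the cut is applied
# to every event at once, which is what makes the style fast and portable
mask = (jet_pt > 30.0) & (np.abs(jet_eta) < 2.5)
selected_pt = jet_pt[mask]
```

The same expression scales from a laptop array to a distributed backend, since no per-event Python code is involved.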
Sanjay Bloor (Imperial College London) | 05/11/2019, 14:00 | Track 6 – Physics Analysis | Oral
GUM is a new feature of the GAMBIT global fitting software framework, which provides a direct interface between Lagrangian level tools and GAMBIT. GUM automatically writes GAMBIT routines to compute observables and likelihoods for physics beyond the Standard Model. I will describe the structure of GUM, the tools (within GAMBIT) it is able to create interfaces to, and the observables it is able...
Dr William Detmold (MIT) | 05/11/2019, 14:15 | Track 6 – Physics Analysis | Oral
I will discuss recent advances in lattice QCD from the physics and computational points of view that have enabled a number of basic properties and interactions of light nuclei to be determined directly from QCD. These calculations offer the prospect of providing nuclear matrix element inputs necessary for a range of intensity frontier experiments (DUNE, mu2e) and dark matter direct-detection experiments...
Ryan Bignell (University of Adelaide) | 05/11/2019, 14:30 | Track 6 – Physics Analysis | Oral
Background field methods offer an approach through which fundamental non-perturbative hadronic properties can be studied. Lattice QCD is the only ab initio method with which Quantum Chromodynamics can be studied at low energies; it involves numerically calculating expectation values in the path integral formalism. This requires substantial investment in high performance super computing...
Alex Westin (The University of Adelaide) | 05/11/2019, 14:45 | Track 6 – Physics Analysis | Oral
There exists a long standing discrepancy of around 3.5 sigma between experimental measurements and standard model calculations of the magnetic moment of the muon. Current experiments aim to reduce the experimental uncertainty by a factor of 4, and Standard Model calculations must also be improved by a similar order. The largest uncertainty in the Standard Model calculation comes from the QCD...
Tomas Howson (University of Adelaide) | 05/11/2019, 15:00 | Track 6 – Physics Analysis | Oral
Computing the gluon component of momentum in the nucleon is a difficult and computationally expensive problem, as the matrix element involves a quark-line-disconnected gluon operator which suffers from ultra-violet fluctuations. Also necessary for a successful determination is the non-perturbative renormalisation of this operator. We investigate this renormalisation here by direct...
Adam Virgili | 05/11/2019, 15:15 | Track 6 – Physics Analysis | Oral
The origin of the low-lying nature of the Roper resonance has been the subject of significant interest for many years, including several investigations using lattice QCD. It has been claimed that chiral symmetry plays an important role in our understanding of this resonance. We present results from our systematic examination of the potential role of chiral symmetry in the low-lying nucleon...
Adam Leinweber (University of Adelaide) | 05/11/2019, 16:30 | Track 6 – Physics Analysis | Oral
Recent searches for supersymmetric particles at the Large Hadron Collider have been unsuccessful in detecting any BSM physics. This is partially because the exact masses of supersymmetric particles are not known, and as such, searching for them is very difficult. The method broadly used in searching for new physics requires one to optimise on the signal being searched for, potentially...
Thomas Britton (JLab) | 05/11/2019, 16:45 | Track 6 – Physics Analysis | Oral
Charged particle tracking represents the largest consumer of CPU resources in high data volume Nuclear Physics experiments. An effort is underway to develop ML networks that will reduce the resources required for charged particle tracking. Tracking in NP experiments presents some unique challenges compared to HEP. In particular, track finding typically represents only a small fraction of the...
Mr Dennis Noll (RWTH Aachen University (DE)) | 05/11/2019, 17:00 | Track 6 – Physics Analysis | Oral
For physics analyses with identical final state objects, e.g. jets, the correct sorting of the objects at the input of the analysis can lead to a considerable performance increase.
We present a new approach in which a sorting network is placed upstream of a classification network. The sorting network combines the whole event information and explicitly pre-sorts the inputs of the analysis....
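For comparison with a learned sorting network, the conventional baseline is a fixed ordering of the identical objects, typically descending in pt, before the features reach the classifier. A sketch with invented toy jets (this is the baseline, not the paper's learned network):

```python
import numpy as np

# Toy event with four jets, features (pt, eta, phi); values are invented
jets = np.array([
    [35.0,  0.2,  1.1],
    [80.0, -1.0,  2.5],
    [50.0,  0.7, -0.3],
    [22.0,  1.9,  0.8],
])

# Fixed pt-ordering: sort rows by descending pt, then flatten the features
# into the input layer of a classification network
order = np.argsort(-jets[:, 0])
sorted_jets = jets[order]
network_input = sorted_jets.reshape(-1)  # shape (12,)
```

A learned sorting stage replaces the fixed `argsort` criterion with an ordering conditioned on the whole event.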
Matthias Komm (Imperial College (GB)) | 05/11/2019, 17:15 | Track 6 – Physics Analysis | Oral
We present preliminary studies of a deep neural network (DNN) "tagger" that is trained to identify the presence of displaced jets arising from the decays of new long-lived particle (LLP) states in data recorded by the CMS detector at the CERN LHC. Particle-level candidates, as well as secondary vertex information, are refined through the use of convolutional neural networks (CNNs) before being...
Mr Kevin Greif (University of Notre Dame) | 05/11/2019, 17:30 | Track 6 – Physics Analysis | Oral
Deep neural networks (DNNs) have been applied to the fields of computer vision and natural language processing with great success in recent years. The success of these applications has hinged on the development of specialized DNN architectures that take advantage of specific characteristics of the problem to be solved, namely convolutional neural networks for computer vision and recurrent...
Dr Nobuo Sato (Jefferson Lab), Nobuo Sato (Florida State University) | 05/11/2019, 17:45 | Track 6 – Physics Analysis | Oral
We describe a multi-disciplinary project to use machine learning techniques based on neural networks (NNs) to construct a Monte Carlo event generator for lepton-hadron collisions that is agnostic of theoretical assumptions about the microscopic nature of particle reactions. The generator, referred to as ETHER (Empirically Trained Hadronic Event Regenerator), is trained on experimental data...
Alexander Held (University of British Columbia (CA)) | 07/11/2019, 11:00 | Track 6 – Physics Analysis | Oral
An important part of the LHC legacy will be precise limits on indirect effects of new physics, framed for instance in terms of an effective field theory. These measurements often involve many theory parameters and observables, which makes them challenging for traditional analysis methods. We discuss the underlying problem of “likelihood-free” inference and present powerful new analysis...
Dr Wally Melnitchouk (Jefferson Lab) | 07/11/2019, 11:15 | Track 6 – Physics Analysis | Oral
Extracting information about the quark and gluon (or parton) structure of the nucleon from high-energy scattering data is a classic example of the inverse problem: the experimental cross sections are given by convolutions of the parton probability distributions with process-dependent hard coefficients that are perturbatively calculable from QCD. While most analyses in the past have been based...
Yang Zhang | 07/11/2019, 11:30 | Track 6 – Physics Analysis | Oral
Phase transitions played an important role in the very early evolution of the Universe. We present a C++ software package (PhaseTracer) for finding cosmological phases and calculating transition properties involving single or multiple scalar fields. The package first maps the phase structure by tracing the vacuum expectation value (VEV) of the potential at different temperatures, then finds...
Lara Lloret Iglesias (CSIC - Consejo Sup. de Investig. Cientif. (ES)) | 07/11/2019, 11:45 | Track 6 – Physics Analysis | Oral
The CERN analysis preservation portal (CAP) comprises a set of tools and services aiming to assist researchers in describing and preserving all the components of a physics analysis such as data, software and computing environment. Together with the associated documentation, all these assets are kept in one place so that the analysis can be fully or partially reused even several years after the...
Matthew Feickert (Southern Methodist University (US)) | 07/11/2019, 12:00 | Track 6 – Physics Analysis | Oral
Likelihoods associated with statistical fits in searches for new physics are beginning to be published by LHC experiments on HEPData [arXiv:1704.05473]. The first of these is the search for bottom-squark pair production by ATLAS [ATLAS-CONF-2019-011]. These likelihoods adhere to a specification first defined by the...
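The flavour of such a published likelihood can be conveyed with a drastically simplified one-bin counting model. The JSON fragment below is loosely modelled on the published format but is not the actual HistFactory specification, and the crude scan over the signal strength is purely illustrative:

```python
import json
import math

# Invented one-channel, one-bin counting "workspace"
spec = json.loads("""
{
  "channels": [{"samples": [
    {"name": "signal",     "data": [5.0]},
    {"name": "background", "data": [50.0]}
  ]}],
  "observations": [{"data": [52]}]
}
""")

def twice_nll(mu):
    """-2 ln L for signal strength mu in a one-bin Poisson counting model."""
    s = spec["channels"][0]["samples"][0]["data"][0]
    b = spec["channels"][0]["samples"][1]["data"][0]
    n = spec["observations"][0]["data"][0]
    lam = mu * s + b
    return -2.0 * (n * math.log(lam) - lam - math.lgamma(n + 1))

# Crude grid scan; the minimum sits where mu*s + b equals the observed count
best_mu = min((m / 100.0 for m in range(0, 301)), key=twice_nll)
```

Publishing the specification rather than only the result lets anyone re-evaluate the likelihood at new parameter points, which is the point of releasing it on HEPData.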
CERN analysis preservation and reuse framework: FAIR research data services for LHC experiments | Pamfilos Fokianos (CERN) | 07/11/2019, 12:15 | Track 6 – Physics Analysis | Oral
In this paper we present the CERN Analysis Preservation service as a FAIR (Findable, Accessible, Interoperable and Reusable) research data preservation repository platform for LHC experiments. The CERN Analysis Preservation repository allows LHC collaborations to deposit and share the structured information about analyses as well as to capture the individual data assets associated to the...
Dr Marco Milesi (The University of Melbourne, Belle II Experiment) | 07/11/2019, 14:00 | Track 6 – Physics Analysis | Oral
We present a major overhaul to lepton identification for the Belle II experiment, based on a novel multi-variate classification algorithm.
A key topic in the Belle II physics programme is the study of semi-tauonic B decays to test lepton flavour universality violation, such as $B\rightarrow D^{*}\tau\nu$. The analysis of this decay relies on the capability of correctly separating low...
Jason Oliver (University of Adelaide (AU)) | 07/11/2019, 14:15 | Track 6 – Physics Analysis | Oral
The Recursive Jigsaw Reconstruction method is a technique to analyze reconstructed particles in the presence of kinematic and combinatoric unknowns which are associated with unmeasured or indistinguishable particles. By factorizing the unknowns according to an assumed topology and applying fixed algorithmic choices (Jigsaw Rules), we are able to approximately reconstruct rest frames throughout...
Kinga Anna Wozniak (University of Vienna (AT)) | 07/11/2019, 14:30 | Track 6 – Physics Analysis | Oral
We propose a new search strategy, based on deep-learning (DL) anomaly detection, to search for new physics in all-jet final states without specific assumptions. The DL model identifies events with anomalous radiation patterns in the jets. This is done by applying a threshold to the reconstruction loss. The threshold is tuned so that the rejected events provide an estimate of the QCD-background...
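The threshold-on-reconstruction-loss idea can be sketched with a linear "autoencoder" built from PCA; the toy 3D feature space, background shape, and 10% working point below are all invented for illustration, standing in for the DL model:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "QCD background": points scattered close to a 1D subspace of a 3D space
direction = np.array([[1.0, 0.5, -0.2]])
background = rng.normal(size=(500, 1)) @ direction
background += 0.05 * rng.normal(size=background.shape)
mean = background.mean(axis=0)

# Linear "autoencoder" via PCA: the latent space is the leading principal
# component, and decoding projects back onto it
_, _, vt = np.linalg.svd(background - mean, full_matrices=False)
component = vt[:1]

def reconstruction_loss(x):
    centred = x - mean
    reconstructed = centred @ component.T @ component
    return np.sum((centred - reconstructed) ** 2, axis=1)

# Threshold tuned so ~10% of background is rejected; the rejected sample
# then serves as the background estimate
threshold = np.quantile(reconstruction_loss(background), 0.9)

anomaly = np.array([[0.0, 3.0, 3.0]])  # far from the background subspace
is_anomalous = reconstruction_loss(anomaly) > threshold
```

Events the model cannot reconstruct well exceed the threshold and are flagged as signal-like, exactly as in the deep version.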
Andrea Valassi (CERN) | 07/11/2019, 14:45 | Track 6 – Physics Analysis | Oral
HEP event selection is traditionally considered a binary classification problem, involving the dichotomous categories of signal and background. In distribution fits for particle masses or couplings, however, signal events are not all equivalent, as the signal differential cross section has different sensitivities to the measured parameter in different regions of phase space. In this talk, I...
Prof. Ivan Kisel (Johann-Wolfgang-Goethe Univ. (DE)) | 07/11/2019, 15:00 | Track 6 – Physics Analysis | Oral
The main purpose of modern experiments with heavy ions is a comprehensive study of the QCD phase diagram in the field of quark-gluon plasma (QGP) and the possible phase transition to the QGP phase.
One of the possible signals of QGP formation is an increase in the production of strange particles. Reconstruction of $\Sigma$ particles together with other strange particles completes the picture...
Gordon Watts (University of Washington (US)) | 07/11/2019, 15:15 | Track 6 – Physics Analysis | Oral
MATHUSLA has been proposed as a detector that sits over 100 m from an LHC interaction point, on the surface, to look for ultra long-lived particles. A test stand was constructed with two layers of scintillator paddles and six layers of RPCs, on loan from the DZERO and the Argo-YBJ experiments. Downward and upward going tracks from cosmic ray data and muons from the interaction point have been...