10–15 Mar 2019
Steinmatte conference center
Europe/Zurich timezone

Session

Track 2: Data Analysis - Algorithms and Tools

Track 2
11 Mar 2019, 15:30
Steinmatte conference center

Hotel Allalin, Saas Fee, Switzerland https://allalin.ch/conference/

Conveners

Track 2: Data Analysis - Algorithms and Tools

  • Tommaso Dorigo (Universita e INFN, Padova (IT))
  • David Rousseau (LAL-Orsay, FR)
  • Jean-Roch Vlimant (California Institute of Technology (US))
  • Andy Buckley (University of Glasgow (GB))
  • Jennifer Ngadiuba (CERN)
  • Wouter Verkerke (Nikhef National institute for subatomic physics (NL))
  • Oleg Kalashev (Institute for Nuclear Research RAS)
  • Kazuhiro Terao (SLAC)


  1. Mario Masciovecchio (Univ. of California San Diego (US))
    11/03/2019, 15:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In the High-Luminosity Large Hadron Collider (HL-LHC), one of the most challenging computational problems is expected to be finding and fitting charged-particle tracks during event reconstruction. The methods currently in use at the LHC are based on the Kalman filter. Such methods have been shown to be robust and to provide good physics performance, both in the trigger and offline. In order to...

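The Kalman-filter approach mentioned in this abstract can be illustrated with a minimal one-dimensional sketch. Everything below is an illustrative assumption (a nearly constant position, invented noise parameters), not the LHC implementation, which propagates full track states through a magnetic field and material:

```python
import numpy as np

def kalman_1d(measurements, meas_var=0.5, process_var=0.001):
    """Minimal 1D Kalman filter: estimate a (nearly constant) track
    position from a sequence of noisy per-layer measurements."""
    x, p = measurements[0], 1.0        # initial state and covariance
    estimates = [x]
    for z in measurements[1:]:
        p = p + process_var            # predict: propagate uncertainty
        k = p / (p + meas_var)         # Kalman gain
        x = x + k * (z - x)            # update with the new hit
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Noisy hits scattered around a true position of 2.0
rng = np.random.default_rng(0)
hits = 2.0 + rng.normal(0.0, 0.5, size=50)
fit = kalman_1d(hits)
```

The filtered estimate settles near the true position as layers are added, while the raw hits keep fluctuating with the full measurement noise.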
  2. Marko Petric (CERN)
    11/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    ConformalTracking is an open source library created in 2015 to serve as a detector independent solution for track reconstruction in detector development studies at CERN. Pattern recognition is one of the most CPU intensive tasks of event reconstruction at present and future experiments. Current tracking programs of the LHC experiments are mostly tightly linked to individual detector...

  3. Dr Jean-Roch Vlimant (California Institute of Technology (US))
    11/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    To address the unprecedented scale of HL-LHC data, the HEP.TrkX project has been investigating a variety of machine learning approaches to particle track reconstruction. The most promising of these solutions, a graph neural network, processes the event as a graph that connects track measurements (detector hits corresponding to nodes) with candidate line segments between the hits (corresponding...

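The graph formulation described here (hits as nodes, candidate segments as edges) can be sketched in toy form. The random weights below stand in for trained parameters and the tiny architecture is invented for illustration; it is not the HEP.TrkX network:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score_edges(node_feats, edges, rng):
    """Toy edge scorer: aggregate neighbour features into each node,
    then score every candidate segment from its endpoint embeddings."""
    n_nodes = node_feats.shape[0]
    agg = np.zeros_like(node_feats)
    deg = np.zeros(n_nodes)
    for i, j in edges:                 # one round of mean aggregation
        agg[i] += node_feats[j]; agg[j] += node_feats[i]
        deg[i] += 1; deg[j] += 1
    agg /= np.maximum(deg, 1)[:, None]
    h = np.tanh(np.concatenate([node_feats, agg], axis=1))  # node embeddings
    # Edge score from concatenated endpoint embeddings; random weights
    # stand in for a trained network.
    w = rng.normal(size=2 * h.shape[1])
    return np.array([sigmoid(np.dot(w, np.concatenate([h[i], h[j]])))
                     for i, j in edges])

rng = np.random.default_rng(1)
hits = rng.normal(size=(6, 3))         # 6 hits with (r, phi, z)-like features
segments = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
scores = score_edges(hits, segments, rng)
```

In a trained model, edges with scores near 1 are kept as track segments and the rest are discarded.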
  4. Sebastian Skambraks (Max-Planck-Institut für Physik)
    11/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Machine learning methods are integrated into the pipelined first level track trigger of the upgraded flavor physics experiment Belle II in Tsukuba, Japan. The novel triggering techniques cope with the severe background conditions coming along with the upgrade of the instantaneous luminosity by a factor of 40 to $\mathcal{L} = 8 \times 10^{35}\,\text{cm}^{-2}\,\text{s}^{-1}$. Using the precise...

  5. Dr Jean-Roch Vlimant (California Institute of Technology (US))
    11/03/2019, 17:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    With the upgrade of the LHC to high luminosity, an increased rate of collisions will place a higher computational burden on track reconstruction algorithms. Typical algorithms such as the Kalman Filter and Hough-like Transformation scale worse than quadratically. However, the energy function of a traditional method for tracking, the geometric Denby-Peterson (Hopfield) network method, can be...

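The Denby-Peterson (Hopfield) energy idea can be sketched as follows. This is a toy with made-up couplings, not the authors' formulation: binary neurons flag candidate segments, and chained segments contribute an energy that rewards alignment and penalizes kinks, so that minimizing the energy selects smooth tracks:

```python
import numpy as np

def segment_energy(active, segments, hits, kink_penalty=5.0):
    """Toy Denby-Peterson energy: for each pair of active segments that
    chain head-to-tail, reward alignment and penalize sharp kinks."""
    e = 0.0
    for a in range(len(segments)):
        for b in range(a + 1, len(segments)):
            if not (active[a] and active[b]):
                continue
            (i1, j1), (i2, j2) = segments[a], segments[b]
            if j1 != i2:               # only chained segments interact
                continue
            v1 = hits[j1] - hits[i1]
            v2 = hits[j2] - hits[i2]
            cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
            e += -cos if cos > 0 else kink_penalty
    return e

hits = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [1.0, -2.0]])
segments = [(0, 1), (1, 2), (1, 3)]    # (1, 2) continues smoothly, (1, 3) kinks
smooth = segment_energy([1, 1, 0], segments, hits)
kinked = segment_energy([1, 0, 1], segments, hits)
```

The smooth chain has lower energy than the kinked one, which is what a mean-field or annealed minimization of the full network exploits.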
  6. Jennifer Ngadiuba (CERN)
    11/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Machine learning is becoming ubiquitous across HEP. There is great potential to improve trigger and DAQ performance with it. However, the exploration of such techniques within the field in low latency/power FPGAs has just begun. We present hls4ml, a user-friendly software package based on High-Level Synthesis (HLS), designed to deploy network architectures on FPGAs. As a case study, we use hls4ml...

  7. Michael J. Morello (SNS and INFN-Pisa (IT)), Riccardo Cenci (Universita & INFN Pisa (IT)), Mr Andrea Di Luca (Universita degli Studi di Trento and INFN (IT)), Federico Lazzari (Universita & INFN Pisa (IT)), Giovanni Punzi (Universita & INFN Pisa (IT))
    11/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Finding tracks downstream of the magnet at the earliest LHCb trigger level is not part of the baseline plan of the Upgrade trigger, on account of the significant CPU time required to execute the search. Many long-lived particles, such as Ks and strange baryons, decay after the vertex track detector (VELO), so that their reconstruction efficiency is limited. We present a study of the...

  8. Michael David Sokoloff (University of Cincinnati (US))
    11/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In the transition to Run 3 in 2021, LHCb will undergo a major luminosity upgrade, going from 1.1 to 5.6 expected visible Primary Vertices (PVs) per event, and will adopt a purely software trigger. This has fueled increased interest in alternative highly-parallel and GPU friendly algorithms for tracking and reconstruction. We will present a novel prototype algorithm for vertexing in the LHCb...

  9. James Kahn (Karlsruhe Institute of Technology (KIT)), Martin Ritter
    11/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Belle II experiment, beginning data taking with the full detector in early 2019, is expected to produce a volume of data fifty times that of its predecessor. With this dramatic increase in data comes the opportunity to study rare, previously inaccessible processes. The investigation of such rare processes in a high data volume environment requires a correspondingly high volume of...

  10. Thong Nguyen (California Institute of Technology (US))
    12/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Generative models, and in particular generative adversarial networks, are gaining momentum in HEP as a possible way to speed up the event simulation process. Traditionally, GAN models applied to HEP are designed to return images. On the other hand, many applications (e.g., analyses based on particle flow) are designed to take as input lists of particles. We investigate the possibility of using...

  11. Vladislav Belavin (CERN)
    12/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    At present, the most convenient approach to electromagnetic shower generation is Monte Carlo simulation produced by software packages such as GEANT4. However, one of the critical problems of Monte Carlo production is that it is extremely slow, since it involves the simulation of numerous subatomic interactions.

    Recently, generative adversarial networks (GANs) have addressed the speed issue in the simulation...

  12. Artem Maevskiy (National Research University Higher School of Economics (RU))
    12/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The increasing luminosities of future LHC runs and next generation of collider experiments will require an unprecedented amount of simulated events to be produced. Such large scale productions are extremely demanding in terms of computing resources. Thus new approaches to event generation and simulation of detector responses are needed. In LHCb the simulation of the RICH detector using the...

  13. Philipp Do Nascimento Gaspar (Federal University of Rio de Janeiro (BR))
    12/03/2019, 17:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    An extensive upgrade programme has been developed for the LHC and its experiments, which is crucial to allow the complete exploitation of the extremely high-luminosity collision data. The programme is staggered in two phases, with the main interventions foreseen in Phase II.
    For this second phase, the main hadronic calorimeter of ATLAS (TileCal) will redesign its readout electronics but the...

  14. Dr Torben Ferber (DESY)
    12/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Belle II experiment at the SuperKEKB e+e- collider has completed its first-collisions run in 2018. The experiment is currently preparing for physics data taking in 2019. The electromagnetic calorimeter of the Belle II detector consists of 8,736 Thallium-doped CsI crystals with PIN-photodiode readout. Each crystal is equipped with waveform digitizers that allow the extraction of energy,...

  15. Dayane Gonçalves (Universidade Federal de Juiz de Fora)
    12/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The ATLAS experiment records data from the proton-proton collisions produced by the Large Hadron Collider (LHC). The Tile Calorimeter is the hadronic sampling calorimeter of ATLAS in the region |η| < 1.7. It uses iron absorbers and scintillators as active material. Jointly with the other calorimeters it is designed for reconstruction of hadrons, jets, tau-particles and missing transverse...

  16. Frederic Alexandre Dreyer (Oxford)
    12/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    We introduce a novel implementation of a reinforcement learning algorithm which is adapted to the problem of jet grooming, a crucial component of jet physics at hadron colliders. We show that the grooming policies trained using a Deep Q-Network model outperform state-of-the-art tools used at the LHC such as Recursive Soft Drop, allowing for improved resolution of the mass of boosted objects....

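The Deep Q-Network training mentioned here rests on the standard Q-learning update, which can be sketched in tabular toy form. The two-action "keep/drop the softer branch" environment and its rewards below are invented for illustration, not the authors' grooming setup:

```python
import numpy as np

# Toy MDP: at each of 5 declustering steps, choose to keep (0) or drop (1)
# the softer branch; dropping soft branches earns +1, dropping hard ones -1.
rng = np.random.default_rng(5)
n_steps, n_actions = 5, 2
soft = np.array([1, 0, 1, 1, 0])       # 1 = the branch at that step is soft
q = np.zeros((n_steps, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1      # learning rate, discount, exploration

for _ in range(2000):
    for s in range(n_steps):
        # Epsilon-greedy action selection
        a = rng.integers(n_actions) if rng.random() < eps else int(q[s].argmax())
        r = (1.0 if soft[s] else -1.0) if a == 1 else 0.0
        nxt = q[s + 1].max() if s + 1 < n_steps else 0.0
        q[s, a] += alpha * (r + gamma * nxt - q[s, a])  # Q-learning update

policy = q.argmax(axis=1)              # learned grooming policy
```

The learned policy drops exactly the soft branches; a DQN replaces the table `q` with a neural network so the same update generalizes over continuous branch kinematics.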
  17. Yannik Alexander Rath (RWTH Aachen University (DE))
    12/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    A large part of the success of deep learning in computer science can be attributed to the introduction of dedicated architectures exploiting the underlying structure of a given task. As deep learning methods are adopted for high energy physics, increasing attention is thus directed towards the development of new models incorporating physical knowledge.

    In this talk, we present a network...

  18. Leo Piilonen (Virginia Tech)
    12/03/2019, 19:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    I describe a novel interactive virtual reality visualization of the Belle II detector at KEK and the animation therein of GEANT4-simulated event histories. Belle2VR runs on Oculus and Vive headsets (as well as in a web browser and on 2D computer screens, in the absence of a headset). A user with some particle-physics knowledge manipulates a gamepad or hand controller(s) to interact with and...

  19. Thomas Alef (University of Bonn (DE))
    13/03/2019, 15:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Multivariate analyses in particle physics often reach a precision such that their uncertainties are dominated by systematic effects. While there are known strategies to mitigate systematic effects based on adversarial neural nets, the application of Boosted Decision Trees (BDTs) has so far had to ignore systematics in the training.
    We present a method to incorporate systematic uncertainties into a...

  20. Mr Nikita Kazeev (Yandex School of Data Analysis (RU))
    13/03/2019, 15:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Analysis in high-energy physics usually deals with data samples populated from different sources. One of the most widely used ways to handle this is the sPlot technique. In this technique the results of a maximum likelihood fit are used to assign weights that can be used to disentangle signal from background. Some events are assigned negative weights, which makes it difficult to apply machine...

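The sPlot weighting described here can be sketched for a two-component mixture. The PDFs, toy sample, and EM fit below are assumptions for illustration, not the analysis in the talk: the covariance of the fitted yields turns per-event PDF values into sWeights, and signal sWeights come out negative for background-like events, which is what complicates downstream machine learning:

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def expo_pdf(x, tau, lo, hi):
    # Exponential PDF truncated to the fit window [lo, hi]
    z = np.exp(-lo / tau) - np.exp(-hi / tau)
    return np.exp(-x / tau) / (tau * z)

def sweights(x, yields, pdfs):
    """sPlot weights: w_n(e) = sum_j V_nj f_j(e) / sum_k N_k f_k(e),
    with V the covariance matrix of the fitted yields."""
    f = np.stack([p(x) for p in pdfs])             # (n_components, n_events)
    denom = yields @ f                             # sum_k N_k f_k(e)
    vinv = (f[:, None, :] * f[None, :, :] / denom ** 2).sum(axis=2)
    v = np.linalg.inv(vinv)
    return (v @ f) / denom

# Toy "mass" sample: Gaussian signal on an exponential background in [3, 8]
rng = np.random.default_rng(2)
sig = rng.normal(5.0, 0.3, 1000)
bkg = rng.exponential(3.0, 20000)
bkg = bkg[(bkg > 3.0) & (bkg < 8.0)][:2000]
x = np.concatenate([sig, bkg])
pdfs = [lambda m: gauss_pdf(m, 5.0, 0.3), lambda m: expo_pdf(m, 3.0, 3.0, 8.0)]

# Maximum-likelihood yields via a few EM iterations (stands in for the fit)
yields = np.full(2, len(x) / 2.0)
f = np.stack([p(x) for p in pdfs])
for _ in range(300):
    yields = (yields[:, None] * f / (yields @ f)).sum(axis=1)

w = sweights(x, yields, pdfs)
```

At the maximum-likelihood yields the sWeights for each event sum to one, the signal sWeights sum to the fitted signal yield, and sideband events receive negative signal weights.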
  21. Benjamin Fischer (Rheinisch Westfaelische Tech. Hoch. (DE))
    13/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Variable-dependent scale factors are commonly used in HEP to improve shape agreement of data and simulation. The choice of the underlying model is of great importance, but often requires a lot of manual tuning e.g. of bin sizes or fitted functions. This can be alleviated through the use of neural networks and their inherent powerful data modeling capabilities.
    We present a novel and...

  22. Pablo de Castro (Universita e INFN, Padova (IT))
    13/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Complex computer simulations are commonly required for accurate data modelling in many scientific disciplines, including experimental High Energy Physics, making statistical inference challenging due to the intractability of the likelihood evaluation for the observed data. Furthermore, sometimes one is interested in inference drawn over a subset of the generative model parameters while taking...

  23. Andreas Sogaard (University of Edinburgh (GB))
    13/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    A large number of physics processes as seen by ATLAS at the LHC manifest as collimated, hadronic sprays of particles known as ‘jets’. Jets originating from the hadronic decay of a massive particle are commonly used both in measurements of the Standard Model and in searches for new physics. The ATLAS experiment has applied machine learning discriminants to the challenging task of...

  24. Dr Steven Prohira (The Ohio State University)
    13/03/2019, 17:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In radio-based physics experiments, sensitive analysis techniques are often required to extract signals at or below the level of noise. For a recent experiment at the SLAC National Accelerator Laboratory to test a radar-based detection scheme for high energy neutrino cascades, such a sensitive analysis was employed to dig down into a spurious background and extract a signal. This analysis...

  25. Andy Buckley (University of Glasgow (GB))
    13/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Gambit collaboration is a new effort in the world of global BSM fitting -- the combination of the largest possible set of observational data from across particle, astro, and nuclear physics to gain a synoptic view of what experimental data has to say about models of new physics. Using a newly constructed, open source code framework, Gambit have released several state-of-the-art scans of...

  26. Mr Victor Estrade (LRI)
    13/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Data analysis based on forward simulation often requires the use of a machine learning model for statistical inference of the parameters of interest. Most of the time these learned models are trained to discriminate events between backgrounds and signals to produce a 1D score, which is used to select a relatively pure signal region. The training of the model does not take into account the final...

  27. Lukas Alexander Heinrich (New York University (US))
    13/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    A common goal in the search for new physics is the determination of sets of New Physics models, typically parametrized by a number of parameters such as masses or couplings, that are either compatible with the observed data or excluded by it, where determining into which category a given model belongs requires expensive computation of the expected signal. This problem may be abstracted...

  28. Dr William Lawrence Sutcliffe (Karlsruhe Institute of Technology (DE))
    13/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Belle II experiment is an e+e- collider experiment in Japan, which begins its main physics run in early 2019. The clean environment of e+e- collisions together with the unique event topology of Belle II, in which an Υ(4S) particle is produced and subsequently decays to a pair of B mesons, allows a wide range of physics measurements to be performed which are difficult or impossible at...

  29. Sergey Shirobokov (Imperial College (GB))
    14/03/2019, 15:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    We investigate the problem of dark matter detection in an emulsion detector. Previously we have shown that it is very challenging but possible to use emulsion films of an OPERA-like detector in the SHiP experiment to separate electromagnetic showers from each other, thus hypothetically separating neutrino events from dark matter. In this study, we have investigated the possibility of using the Target...

  30. Mr Constantin Steppa (University of Potsdam)
    14/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Ground-based $\gamma$-ray astronomy relies on reconstructing primary particles' properties from the measurement of the induced air showers. Currently, template fitting is the state-of-the-art method for reconstructing air showers. CNNs represent a promising means to improve on this method in both accuracy and computational cost. Promoted by the availability of inexpensive hardware and open-source...

  31. Jonas Glombitza (RWTH Aachen)
    14/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In recent years, the astroparticle physics community has successfully adapted supervised learning algorithms for a wide range of tasks, including event reconstruction in cosmic ray observatories[1], photon identification at Cherenkov telescopes[2], and the extraction of gravitational wave signals from time traces[3]. In addition, first unsupervised learning approaches of generative models at...

  32. Laura Domine (Stanford University/SLAC)
    14/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    From a breakthrough innovation, Deep Learning (DL) has grown to become a de facto standard technique in the fields of artificial intelligence and computer vision. In particular, Convolutional Neural Networks (CNNs) have been shown to be a powerful DL technique for extracting physics features from images: they have been successfully applied to the data reconstruction and analysis of Liquid Argon Time...

  33. Gevy Cao (Queen's University)
    14/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    PICO is a dark matter experiment using superheated bubble chamber technology. One of the main analysis challenges in PICO is to unambiguously distinguish between background events and nuclear recoil events from possible WIMP scatters. The conventional discriminator, acoustic parameter (AP), utilizes frequency analysis in Fourier space to compute the acoustic power, which is proven to be...

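The frequency-domain discriminator idea can be sketched generically. The sampling rate, band, and toy burst below are assumptions for illustration, not PICO's actual acoustic parameter definition: integrate the power spectrum of the acoustic trace over a chosen band, so that bubble-like bursts score higher than pure noise:

```python
import numpy as np

def band_power(trace, fs, f_lo, f_hi):
    """Integrate the power spectrum of an acoustic trace over [f_lo, f_hi]."""
    spec = np.abs(np.fft.rfft(trace)) ** 2
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[band].sum() / len(trace)

fs = 100_000.0                          # hypothetical 100 kHz sampling rate
t = np.arange(2048) / fs
rng = np.random.default_rng(3)
# Toy event: a decaying 20 kHz burst on top of broadband noise
trace = (np.sin(2 * np.pi * 20_000 * t) * np.exp(-t * 2_000)
         + 0.1 * rng.normal(size=t.size))
loud = band_power(trace, fs, 15_000, 25_000)
quiet = band_power(0.1 * rng.normal(size=t.size), fs, 15_000, 25_000)
```

A discriminator built this way separates event classes by how much acoustic energy they deposit in the chosen frequency band.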
  34. Dennis Noll (Rheinisch Westfaelische Tech. Hoch. (DE))
    14/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Deep learning architectures in particle physics are often strongly dependent on the order of their input variables. We present a two-stage deep learning architecture consisting of a network for sorting input objects and a subsequent network for data analysis. The sorting network (agent) is trained through reinforcement learning using feedback from the analysis network (environment). A tree...

  35. Artem Ryzhikov (Yandex School of Data Analysis (RU))
    14/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Accurate particle identification (PID) is one of the most important aspects of the LHCb experiment. Modern machine learning techniques such as deep neural networks are efficiently applied to this problem and are integrated into the LHCb software. In this research, we discuss novel applications of neural network speed-up techniques to achieve faster PID in LHC upgrade conditions. We show that...

  36. Olmo Cerri (California Institute of Technology (US))
    14/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Using variational autoencoders trained on known physics processes, we develop a one-sided p-value test to isolate previously unseen event topologies as outlier events. Since the autoencoder training does not depend on any specific new physics signature, the proposed procedure has a weak dependence on underlying assumptions about the nature of new physics. An event selection based on this...

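The outlier test described here can be sketched with a linear stand-in for the variational autoencoder. Everything below is an assumption for illustration, not the authors' model: score events by reconstruction error under a model fit to known processes, then convert the score into a one-sided empirical p-value against a reference sample:

```python
import numpy as np

def fit_linear_ae(x, k=2):
    """'Train' a linear autoencoder on known-physics events by keeping
    the top-k principal directions (a stand-in for a trained VAE)."""
    mu = x.mean(axis=0)
    _, _, vt = np.linalg.svd(x - mu, full_matrices=False)
    return mu, vt[:k]

def recon_error(x, mu, w):
    z = (x - mu) @ w.T                  # encode
    return np.sum((x - (mu + z @ w)) ** 2, axis=1)  # decode and compare

def empirical_pvalue(score, ref_scores):
    """One-sided p-value: fraction of reference events at least as anomalous."""
    return (np.sum(ref_scores >= score) + 1) / (len(ref_scores) + 1)

rng = np.random.default_rng(4)
# "Known physics" events live near a 2D plane in a 5D feature space
basis = rng.normal(size=(2, 5))
train = rng.normal(size=(500, 2)) @ basis + 0.05 * rng.normal(size=(500, 5))
ref = rng.normal(size=(500, 2)) @ basis + 0.05 * rng.normal(size=(500, 5))
mu, w = fit_linear_ae(train)
ref_scores = recon_error(ref, mu, w)
outlier = rng.normal(size=5) * 3.0      # an off-manifold event
p = empirical_pvalue(recon_error(outlier[None], mu, w)[0], ref_scores)
```

Because the off-manifold event reconstructs poorly, its empirical p-value is small, and thresholding on p selects outlier-enriched events without assuming a specific signal.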
  37. Wahid Bhimji (Lawrence Berkeley National Lab. (US))
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    We present recent work in deep learning for particle physics and cosmology at NERSC, the US Dept. of Energy mission HPC centre. We will describe activity in new methods and applications; distributed training across HPC resources; and plans for accelerated hardware for deep learning in NERSC-9 (Perlmutter) and beyond.
    Some of the HEP methods and applications showcased include conditional...

  38. Konstantin Malanchev
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The next generation of astronomical surveys will revolutionize our understanding of the Universe, raising unprecedented data challenges in the process. One of them is the impossibility to rely on human scanning for the identification of unusual/unpredicted astrophysical objects. Moreover, given that most of the available data will be in the form of photometric observations, such...

  39. Sydney Otten (Radboud Universiteit Nijmegen)
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Although the standard model of particle physics is successful in describing physics as we know it, it is known to be incomplete. Many models have been developed to extend the standard model, none of which have been experimentally verified. One of the main hurdles in this effort is the dimensionality of these models, yielding problems in analysing, visualising and communicating results. Because...

  40. Antoni Shtipliyski (Imperial College (GB))
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The High-Luminosity upgrade of LHC (HL-LHC) is expected to deliver a total luminosity of 3000 fb$^{-1}$ to the general purpose experiments. This will allow the measurement of Standard Model processes with unprecedented precision, and will significantly increase the reach of searches for new physics. Higher data rates and increased radiation levels will require substantial upgrades to the...

  41. Heather Gray (LBNL)
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Universal Quantum Computing may still be a few years away, but we have entered the Noisy Intermediate-Scale Quantum era which ranges from D-Wave commercial Quantum Annealers to a wide selection of gate-based quantum processor prototypes. These provide us with the opportunity to evaluate the potential of quantum computing for HEP applications.
    We will present early results from the DOE HEP.QPR...
