10–15 Mar 2019
Steinmatte conference center
Europe/Zurich timezone

Contribution List

223 out of 223 displayed
  1. Federico Carminati (CERN)
    11/03/2019, 09:00
    Plenary
  2. Monique Werlen (EPFL - Ecole Polytechnique Federale Lausanne (CH))
    11/03/2019, 09:10
    Track 1: Computing Technology for Physics Research
    Plenary
  3. Stanislav Poslavskii (Institute for High Energy Physics of NRC Kurchatov Institute (R)
    11/03/2019, 09:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    Computer algebra is one of the key tools in modern physics research. In this talk I will give an overview of the main mathematical and programming concepts that form the basis of modern computer algebra tools and how they are applied to solving problems in modern theoretical physics and some engineering problems. I will also give a sketch overview of modern computer algebra software, including general...

    Go to contribution page
  4. Dr Oleg Kalashev (Institute for Nuclear Research RAS)
    11/03/2019, 10:00
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The extremely low flux of ultra-high energy cosmic rays
    (UHECR) makes their direct observation by orbital experiments
    practically impossible. For this reason, all current and planned UHECR
    experiments detect cosmic rays indirectly, observing extensive air
    showers (EAS) initiated by cosmic ray particles in the atmosphere.
    Various types of shower observables are analysed in modern...

    Go to contribution page
  5. Daniel Ratner (Stanford University)
    11/03/2019, 11:00
    Track 2: Data Analysis - Algorithms and Tools
    Plenary

    X-ray Free Electron Lasers (XFELs) are among the most complex accelerator projects in the world today. With large parameter spaces, sensitive dependence on beam quality, huge data rates, and challenging machine protection, there are diverse opportunities to apply machine learning (ML) to XFEL operation. This talk will summarize promising ML methods and highlight recent examples of successful...

    Go to contribution page
  6. Michael Bussmann (HZDR, Dresden)
    11/03/2019, 11:30
    Track 1: Computing Technology for Physics Research
    Plenary
  7. Rajesh Ranganath (New York University)
    11/03/2019, 12:00
    Track 1: Computing Technology for Physics Research
    Plenary
  8. Stefano Carrazza (CERN)
    11/03/2019, 15:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    We present a novel general Boltzmann machine with continuous visible
    and discrete integer valued hidden states, yielding a parametric
    density function involving a ratio of Riemann-Theta functions. After a
    brief overview of the theory required to define this new ML
    architecture, we show how the conditional expectation of a hidden
    state for given visible states can be used as an activation function...

    Go to contribution page
  9. Axel Naumann (CERN)
    11/03/2019, 15:30
    Track 1: Computing Technology for Physics Research
    Oral

    When it comes to number-crunching, C++ is at the core of HENP’s software. But while C++17 is old news, many of us have not had the chance to use it yet. And why would we? This presentation introduces some of the main reasons to move to C++17, focusing on performant, readable code and robust interfaces.

    Where C++17 has many new features that help, C++20 might come as “your next C++11”, a major step...

    Go to contribution page
  10. Dr Elise de Doncker (Western Michigan University)
    11/03/2019, 15:50
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    We report on multi-loop integral computations executed on a
    PEZY/Exascaler large-scale (immersion cooling) computing system.
    The programming model requires a host program written in C++
    with an OpenCL kernel. However, the kernel can be generated by
    the Goose compiler interface, which allows parallelizing loops
    according to compiler directives. As an advantage, the executable
    derived from a...

    Go to contribution page
  11. Mario Masciovecchio (Univ. of California San Diego (US))
    11/03/2019, 15:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In the High-Luminosity Large Hadron Collider (HL-LHC), one of the most challenging computational problems is expected to be finding and fitting charged-particle tracks during event reconstruction. The methods currently in use at the LHC are based on the Kalman filter. Such methods have been shown to be robust and to provide good physics performance, both in the trigger and offline. In order to...

    Go to contribution page
  12. Marco Peruzzi (CERN)
    11/03/2019, 15:50
    Track 1: Computing Technology for Physics Research
    Oral

    NANOAOD is an event data format that has recently been commissioned by the CMS Collaboration to serve the needs of a substantial fraction of its physics analyses. The new format is about 20 times more compact than the MINIAOD format and only includes high level physics object information. NANOAOD is easily customisable for development activities, and supports standardised routines for content...

    Go to contribution page
  13. Stephan Hageboeck
    11/03/2019, 16:10
    Track 1: Computing Technology for Physics Research
    Oral

    PyROOT is the name of ROOT’s Python bindings, which make it possible to access all the ROOT functionality implemented in C++ from Python. Thanks to the ROOT type system and the Cling C++ interpreter, PyROOT creates Python proxies for C++ entities on the fly, thus avoiding the need to generate static bindings beforehand.

    PyROOT is in the process of being enhanced and modernised to meet the demands of the HEP...

    Go to contribution page
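
    A minimal sketch of the on-the-fly binding style described in this contribution, using only standard PyROOT calls; the small C++ function and the histogram are invented for illustration:

        import ROOT

        # JIT-compile a tiny C++ function through the Cling interpreter; PyROOT
        # exposes it as ROOT.add_in_quadrature without pre-generated bindings.
        ROOT.gInterpreter.Declare("""
        #include <cmath>
        double add_in_quadrature(double a, double b) { return std::sqrt(a*a + b*b); }
        """)
        print(ROOT.add_in_quadrature(3.0, 4.0))   # 5.0

        # C++ classes such as TH1D are likewise proxied on the fly.
        h = ROOT.TH1D("h", "example;x;entries", 100, -5.0, 5.0)
        h.FillRandom("gaus", 10000)
        print(h.GetMean())
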
  14. Marko Petric (CERN)
    11/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    ConformalTracking is an open source library created in 2015 to serve as a detector independent solution for track reconstruction in detector development studies at CERN. Pattern recognition is one of the most CPU intensive tasks of event reconstruction at present and future experiments. Current tracking programs of the LHC experiments are mostly tightly linked to individual detector...

    Go to contribution page
  15. Dr Valentin Hirschi (ETHZ)
    11/03/2019, 16:10
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The high-energy community recently witnessed the first attempts at leveraging machine (deep) learning techniques for improving the efficiency of the numerical Monte-Carlo integrations that lie at the core of most high-energy physics simulations.
    The first part of my talk will characterise the various types of integration necessary in these simulations as well as the type of improvements that...

    Go to contribution page
  16. Dr Jean-Roch Vlimant (California Institute of Technology (US))
    11/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    To address the unprecedented scale of HL-LHC data, the HEP.TrkX project has been investigating a variety of machine learning approaches to particle track reconstruction. The most promising of these solutions, a graph neural network, processes the event as a graph that connects track measurements (detector hits corresponding to nodes) with candidate line segments between the hits (corresponding...

    Go to contribution page
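
    A toy illustration (numbers and selection criteria invented) of the graph representation mentioned in this contribution: detector hits become nodes, and candidate segments between hits on adjacent layers become edges that a graph neural network would then classify:

        import numpy as np

        # Three hits, each given as (layer index, phi coordinate) -- toy values only.
        hits = np.array([[1, 0.10],
                         [2, 0.15],
                         [3, 0.22]])

        edges = []
        for i, (layer_i, phi_i) in enumerate(hits):
            for j, (layer_j, phi_j) in enumerate(hits):
                # candidate segment: adjacent layers with a small phi difference
                if layer_j - layer_i == 1 and abs(phi_j - phi_i) < 0.1:
                    edges.append((i, j))

        print(edges)   # [(0, 1), (1, 2)]
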
  17. Dr Anne Cadiou (CNRS)
    11/03/2019, 16:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    Massively parallel simulations generate increasing volumes of large data, whose exploitation requires large storage resources, efficient networks and increasingly large post-processing facilities. In the coming era of exascale computations, there is an emerging need for new data analysis and visualization strategies.

    Data manipulation, during the simulation and after, considerably slows down...

    Go to contribution page
  18. Lorenzo Moneta (CERN)
    11/03/2019, 16:30
    Track 1: Computing Technology for Physics Research
    Oral

    During the past two years ROOT's analysis tools underwent a major renovation,
    embracing a declarative approach.

    This contribution explores the most recent developments of the implementation of
    such an approach, some real-life examples from LHC experiments, as well as present
    and future R&D lines.

    After an introduction of the tool offering access to declarative analysis,
    RDataFrame, the newly...

    Go to contribution page
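
    A short sketch of the declarative style offered by RDataFrame, the tool named in this contribution; the file, tree and branch names ("events.root", "Events", "pt", "eta") are placeholders:

        import ROOT

        df = ROOT.RDataFrame("Events", "events.root")

        # Declare what to do (filter, define, histogram); the event loop itself
        # runs lazily, only when a result such as the histogram is accessed.
        h = (df.Filter("pt > 20 && abs(eta) < 2.4", "kinematic selection")
               .Define("pt2", "pt * pt")
               .Histo1D(("h_pt2", ";p_{T}^{2};events", 100, 0.0, 1.0e4), "pt2"))

        print(h.GetEntries())
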
  19. Sebastian Skambraks (Max-Planck-Institut für Physik)
    11/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Machine learning methods are integrated into the pipelined first-level track trigger of the upgraded flavor physics experiment Belle II in Tsukuba, Japan. The novel triggering techniques cope with the severe background conditions that come with the increase of the instantaneous luminosity by a factor of 40 to $\mathcal{L} = 8 \times 10^{35}\,\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$. Using the precise...

    Go to contribution page
  20. Iliana Betsou (National Technical Univ. of Athens (GR))
    11/03/2019, 16:50
    Track 1: Computing Technology for Physics Research
    Oral

    For two decades, ROOT provided its own graphics system abstraction, based
    on a graphics model inspired by the popular graphics systems available
    at that time (X11, OpenGL, Cocoa, ...).

    With the emergence of modern C++ and of recent graphics systems based on client/server
    models, it was time to completely redefine ROOT graphics.

    This has been done in the context of ROOT 7, which provides the...

    Go to contribution page
  21. Dr Jean-Roch Vlimant (California Institute of Technology (US))
    11/03/2019, 17:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    With the upgrade of the LHC to high luminosity, an increased rate of collisions will place a higher computational burden on track reconstruction algorithms. Typical algorithms such as the Kalman Filter and Hough-like Transformation scale worse than quadratically. However, the energy function of a traditional method for tracking, the geometric Denby-Peterson (Hopfield) network method, can be...

    Go to contribution page
  22. Dr Alexander Nozik (INR RAS / MIPT)
    11/03/2019, 17:10
    Track 1: Computing Technology for Physics Research
    Oral

    Modern data processing (acquisition, storage and analysis) requires modern tools.
    One of the problems shared by existing scientific software is the "scripting" approach, in which the user writes an imperative script that describes the stages in which data should be processed. The main deficiency of such an approach is the lack of possibilities to automate the process. For example, one usually needs a script to...

    Go to contribution page
  23. Zhenjing Cheng (Institute of High Energy Physics)
    11/03/2019, 18:00
    Track 1: Computing Technology for Physics Research
    Oral

    As a data-intensive computing application, high-energy physics requires storage and computing for large amounts of data at the PB level. Performance demands and data access imbalances in mass storage systems are increasing. Specifically, on one hand, traditional cheap disk storage systems have been unable to handle services with high IOPS demands. On the other hand, a survey found that only a very...

    Go to contribution page
  24. Jennifer Ngadiuba (CERN)
    11/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Machine learning is becoming ubiquitous across HEP. There is great potential to improve trigger and DAQ performance with it. However, the exploration of such techniques within the field in low latency/power FPGAs has just begun. We present hls4ml, a user-friendly software package based on High-Level Synthesis (HLS), designed to deploy network architectures on FPGAs. As a case study, we use hls4ml...

    Go to contribution page
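
    A rough sketch of the hls4ml workflow described in this contribution, assuming its Python interface roughly as documented; the toy Keras model and the output directory name are illustrative assumptions, not taken from the talk:

        import hls4ml
        import tensorflow as tf

        # Toy Keras model standing in for a trained trigger classifier.
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(32, activation="relu", input_shape=(16,)),
            tf.keras.layers.Dense(5, activation="softmax"),
        ])

        # Translate the network into an HLS project that can target an FPGA.
        config = hls4ml.utils.config_from_keras_model(model, granularity="model")
        hls_model = hls4ml.converters.convert_from_keras_model(
            model, hls_config=config, output_dir="hls4ml_prj")
        hls_model.compile()   # builds a C simulation of the generated firmware
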
  25. Mr Jean-Philippe Guillet (LAPTh CNRS)
    11/03/2019, 18:00
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    A key ingredient in an automated evaluation of two-loop multileg processes is a
    fast and numerically stable evaluation of scalar Feynman integrals. In this respect, the calculation of two-loop three- and four-point functions in the general complex mass case so far relies on multidimensional numerical integration through sector decomposition whereby a reliable result has a high computing cost,...

    Go to contribution page
  26. Michael J. Morello (SNS and INFN-Pisa (IT)), Riccardo Cenci (Universita & INFN Pisa (IT)), Mr Andrea Di Luca (Universita degli Studi di Trento and INFN (IT)), Federico Lazzari (Universita & INFN Pisa (IT)), Giovanni Punzi (Universita & INFN Pisa (IT))
    11/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Finding tracks downstream of the magnet at the earliest LHCb trigger level is not part of the baseline plan of the Upgrade trigger, on account of the significant CPU time required to execute the search. Many long-lived particles, such as Ks and strange baryons, decay after the vertex track detector (VELO), so that their reconstruction efficiency is limited. We present a study of the...

    Go to contribution page
  27. Mirco Tracolli (INFN Perugia)
    11/03/2019, 18:20
    Track 1: Computing Technology for Physics Research
    Oral

    DODAS stands for Dynamic On Demand Analysis Service and is a Platform as a Service toolkit built around several EOSC-hub services designed to instantiate and configure on-demand container-based clusters over public or private Cloud resources. It automates the whole workflow from service provisioning to the configuration and setup of software applications. Therefore, such a solution allows one to use...

    Go to contribution page
  28. Michael David Sokoloff (University of Cincinnati (US))
    11/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In the transition to Run 3 in 2021, LHCb will undergo a major luminosity upgrade, going from 1.1 to 5.6 expected visible Primary Vertices (PVs) per event, and will adopt a purely software trigger. This has fueled increased interest in alternative highly-parallel and GPU friendly algorithms for tracking and reconstruction. We will present a novel prototype algorithm for vertexing in the LHCb...

    Go to contribution page
  29. Mr Qi Xu (Institute of High Energy Physics, Chinese Academy of Sciences)
    11/03/2019, 18:40
    Track 1: Computing Technology for Physics Research
    Oral

    A large amount of data is produced by large-scale scientific facilities in the high energy physics (HEP) field, and distributed computing technologies have been widely used to process these data. In traditional computing models such as grid computing, computing jobs are usually scheduled to the sites where the input data have been pre-staged. This model leads to some problems including low CPU...

    Go to contribution page
  30. Andrey Zarochentsev (St Petersburg State University (RU))
    11/03/2019, 19:00
    Track 1: Computing Technology for Physics Research
    Oral

    Storage has been identified as the main challenge for future distributed computing infrastructures: Particle Physics (HL-LHC, DUNE, Belle-II), Astrophysics and Cosmology (SKA, LSST). In particular, the High Luminosity LHC (HL-LHC) will begin operations in 2026, with data volumes expected to increase by at least an order of magnitude compared with the present systems....

    Go to contribution page
  31. James Kahn (Karlsruhe Institute of Technology (KIT)), Martin Ritter
    11/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Belle II experiment, beginning data taking with the full detector in early 2019, is expected to produce a volume of data fifty times that of its predecessor. With this dramatic increase in data comes the opportunity for studies of rare, previously inaccessible processes. The investigation of such rare processes in a high data volume environment requires a correspondingly high volume of...

    Go to contribution page
  32. Atilim Gunes Baydin (University of Oxford)
    12/03/2019, 09:00
    Track 2: Data Analysis - Algorithms and Tools
    Plenary

    We present a novel framework that enables efficient probabilistic inference in large-scale scientific models by allowing the execution of existing domain-specific simulators as probabilistic programs, resulting in highly interpretable posterior inference. Our framework is general purpose and scalable, and is based on a cross-platform probabilistic execution protocol through which an inference...

    Go to contribution page
  33. Heather Gray (LBNL)
    12/03/2019, 09:40
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  34. Dr Frank K. Gürkaynak (ETH)
    12/03/2019, 10:10
    Track 1: Computing Technology for Physics Research
    Plenary

    Since 2013, ETH Zürich and the University of Bologna have been working on the PULP project to develop energy-efficient computing architectures suitable for a wide range of applications, starting from the IoT domain, where computations have to be done within a few milliwatts, all the way to the HPC domain, where the goal is to extract the maximum number of calculations within a given power budget. For...

    Go to contribution page
  35. Danilo Rezende (DeepMind)
    12/03/2019, 11:00
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  36. Stephen Philip Jones
    12/03/2019, 11:30
    Track 1: Computing Technology for Physics Research
    Plenary
  37. Alexander Belyaev (Southampton University)
    12/03/2019, 12:00
    Plenary
  38. Prof. Andrey Arbuzov (Joint Institute for Nuclear Research (RU))
    12/03/2019, 15:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    Complete one-loop electroweak radiative corrections to polarized Bhabha
    scattering are presented. Higher order QED effects are evaluated in the leading
    logarithmic approximation. Numerical results are shown for the conditions of future
    circular and linear electron-positron colliders with polarized beams. Theoretical
    uncertainties are estimated.

    Go to contribution page
  39. Marco Cattaneo (CERN)
    12/03/2019, 15:30
    Track 1: Computing Technology for Physics Research
    Oral

    The LHCb Upgrade experiment will start operations in LHC Run 3 from 2021 onwards. Owing to the five-times higher instantaneous luminosity and higher foreseen trigger efficiency, the LHCb Upgrade will collect signal yields per unit time approximately ten times higher than that of the current experiment, with pileup increasing by a factor of six. This contribution presents the changes in the...

    Go to contribution page
  40. Dr Renat Sadykov (Joint Institute for Nuclear Research (RU))
    12/03/2019, 15:50
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    A new Monte Carlo event generator MCSANCee for simulation of processes at future $e^+e^-$ colliders is presented. Complete one-loop electroweak radiative corrections and polarization of the initial beams are taken into account. The present generator includes the following processes: $e^+e^- \to e^+e^-$ ($\mu^+\mu^-$, $\tau^+\tau^-$, $ZH$, $Z\gamma$, $\gamma\gamma$). Numerical results for all of these processes...

    Go to contribution page
  41. Niklas Nolte (CERN / Technische Universitaet Dortmund (DE))
    12/03/2019, 15:50
    Track 1: Computing Technology for Physics Research
    Oral

    The LHCb experiment will be upgraded for data taking in Run 3 and beyond, and the instantaneous luminosity will in particular increase by a factor of five. The lowest-level trigger of the current experiment, a hardware-based trigger that has a hard limit of 1 MHz on its event output rate, will be removed and replaced with a full software trigger. This new
    trigger needs to sustain rates of up to 30 MHz...

    Go to contribution page
  42. David Lange (Princeton University (US))
    12/03/2019, 16:10
    Track 1: Computing Technology for Physics Research
    Oral

    The HL-LHC program has seen numerous extrapolations of its needed computing resources that each indicate the need for substantial changes if the desired HL-LHC physics program is to be supported within the current level of computing resource budgets. Drivers include detector upgrades, large increases in event complexity (leading to increased processing time and analysis data size) and trigger...

    Go to contribution page
  43. Yoshimasa Kurihara (KEK)
    12/03/2019, 16:10
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The GRACE system is an automatic system to calculate cross sections based on the Standard Model and the MSSM, including one-loop corrections.
    I would like to report on recent progress of the GRACE system, including the optimization of generated code.

    Go to contribution page
  44. Thong Nguyen (California Institute of Technology (US))
    12/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Generative models, and in particular generative adversarial networks, are gaining momentum in HEP as a possible way to speed up the event simulation process. Traditionally, GAN models applied to HEP are designed to return images. On the other hand, many applications (e.g., analyses based on particle flow) are designed to take lists of particles as input. We investigate the possibility of using...

    Go to contribution page
  45. Mr Alexandr Korobov (BINP, NSU)
    12/03/2019, 16:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The Monte Carlo generator to simulate events of single-photon
    annihilation to hadrons at center-of-mass energies below 2.5 GeV
    is described. The generator is based on existing data on cross sections
    of various exclusive channels of e+e- annihilation obtained in various
    e+e- experiments by the scan and ISR methods. It is extensively used
    in the software packages for analysis of experiments at...

    Go to contribution page
  46. Vladislav Belavin (CERN)
    12/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    At the moment, the most convenient approach to electromagnetic shower generation is Monte Carlo simulation produced by software packages like GEANT4. However, one of the critical problems of Monte Carlo production is that it is extremely slow, since it involves the simulation of numerous subatomic interactions.

    Recently, generative adversarial networks (GANs) have addressed the speed issue in the simulation...

    Go to contribution page
  47. Miguel Villaplana (Università degli Studi e INFN Milano (IT))
    12/03/2019, 16:30
    Track 1: Computing Technology for Physics Research
    Oral

    The ATLAS experiment has so far produced hundreds of petabytes of data and expects to have one order of magnitude more in the future. These data are spread among hundreds of computing Grid sites around the world. The EventIndex is the complete catalogue of all ATLAS events, real and simulated, keeping the references to all permanent files that contain a given event in any processing stage. It...

    Go to contribution page
  48. Artem Maevskiy (National Research University Higher School of Economics (RU))
    12/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The increasing luminosities of future LHC runs and next generation of collider experiments will require an unprecedented amount of simulated events to be produced. Such large scale productions are extremely demanding in terms of computing resources. Thus new approaches to event generation and simulation of detector responses are needed. In LHCb the simulation of the RICH detector using the...

    Go to contribution page
  49. Oskar Wyszynski (CERN)
    12/03/2019, 16:50
    Track 1: Computing Technology for Physics Research
    Oral

    Network monitoring is of great importance for every data acquisition system (DAQ), as it ensures a stable and uninterrupted data flow. However, when using standard tools such as Icinga, the homogeneity of the DAQ hardware is often not exploited.
    We will present the application of machine learning techniques to detect anomalies among network devices as well as connection instabilities. The former...

    Go to contribution page
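
    As a purely generic illustration of the kind of unsupervised anomaly detection mentioned in this contribution (not the contribution's actual method), outlying devices can be flagged from a table of per-device monitoring metrics:

        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(1)
        metrics = rng.normal(size=(500, 4))   # e.g. packet rate, errors, latency, load
        metrics[::100] += 6.0                 # inject a few anomalous devices

        model = IsolationForest(contamination=0.01, random_state=0).fit(metrics)
        labels = model.predict(metrics)       # -1 marks anomalies, +1 normal devices
        print(np.where(labels == -1)[0])
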
  50. Philipp Do Nascimento Gaspar (Federal University of Rio de Janeiro (BR))
    12/03/2019, 17:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    An extensive upgrade programme has been developed for LHC and its experiments, which is crucial to allow the complete exploitation of the extremely high-luminosity collision data. The programme is staggered in two phases, so that the main interventions are foreseen in Phase II.
    For this second phase, the main hadronic calorimeter of ATLAS (TileCal) will redesign its readout electronics but the...

    Go to contribution page
  51. Dr Torben Ferber (DESY)
    12/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Belle II experiment at the SuperKEKB e+e- collider has completed its first-collisions run in 2018. The experiment is currently preparing for physics data taking in 2019. The electromagnetic calorimeter of the Belle II detector consists of 8,736 Thallium-doped CsI crystals with PIN-photodiode readout. Each crystal is equipped with waveform digitizers that allow the extraction of energy,...

    Go to contribution page
  52. Andrea Massironi (CERN)
    12/03/2019, 18:00
    Track 1: Computing Technology for Physics Research
    Oral

    The increasing LHC luminosity in Run III and, consequently, the increased number of simultaneous proton-proton collisions (pile-up) pose significant challenges for the CMS experiment. These challenges will affect not only the data taking conditions, but also the data processing environment of CMS, which requires an improvement in the online triggering system to match the required detector...

    Go to contribution page
  53. Conor Fitzpatrick (Ecole Polytechnique Federale de Lausanne (CH))
    12/03/2019, 18:20
    Track 1: Computing Technology for Physics Research
    Oral

    The first LHCb upgrade will take data at an instantaneous luminosity of $2\times10^{33}\,\mathrm{cm}^{-2}\,\mathrm{s}^{-1}$ starting in 2021. Due to the high rate of beauty and charm signals, LHCb has chosen as its baseline to read out the entire detector into a software trigger running on commodity x86 hardware at the LHC collision frequency of 30 MHz, where a full offline-quality reconstruction will be performed. In this talk...

    Go to contribution page
  54. Dayane Gonçalves (Universidade Federal de Juiz de Fora)
    12/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The ATLAS experiment records data from the proton-proton collisions produced by the Large Hadron Collider (LHC). The Tile Calorimeter is the hadronic sampling calorimeter of ATLAS in the region |η| < 1.7. It uses iron absorbers and scintillators as active material. Jointly with the other calorimeters it is designed for reconstruction of hadrons, jets, tau-particles and missing transverse...

    Go to contribution page
  55. Mr Sergey Volkov
    12/03/2019, 18:20
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    A high-precision calculation of the electron anomalous magnetic moment requires an evaluation of QED Feynman diagrams up to five independent loops. To make this calculation practically feasible it is necessary to remove all infrared and ultraviolet divergences before integration. A procedure of removing both infrared and ultraviolet divergences in each individual Feynman diagram will be...

    Go to contribution page
  56. Lukas Alexander Heinrich (New York University (US))
    12/03/2019, 18:40
    Track 1: Computing Technology for Physics Research
    Oral

    The ATLAS software infrastructure has undergone several changes towards the adoption of the Continuous Integration methodology to develop and test software. The user community can benefit from a CI environment in several ways: they can develop their custom analysis, then build and test it using revision control services such as GitLab. By providing targeted official base images, ATLAS enables users to...

    Go to contribution page
  57. Frederic Alexandre Dreyer (Oxford)
    12/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    We introduce a novel implementation of a reinforcement learning
    algorithm which is adapted to the problem of jet grooming, a
    crucial component of jet physics at hadron colliders. We show
    that the grooming policies trained using a Deep Q-Network model
    outperform state-of-the-art tools used at the LHC such as
    Recursive Soft Drop, allowing for improved resolution of the mass
    of boosted objects....

    Go to contribution page
  58. Jennifer Ngadiuba (CERN)
    12/03/2019, 19:00
    Track 1: Computing Technology for Physics Research
    Oral

    Resources required for high-throughput computing in large-scale particle physics experiments face challenging demands both now and in the future. The growing exploration of machine learning algorithms in particle physics offers new solutions to simulation, reconstruction, and analysis. These new machine learning solutions often lead to increased parallelization and faster reconstructions...

    Go to contribution page
  59. Yannik Alexander Rath (RWTH Aachen University (DE))
    12/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    A large part of the success of deep learning in computer science can be attributed to the introduction of dedicated architectures exploiting the underlying structure of a given task. As deep learning methods are adopted for high energy physics, increasing attention is thus directed towards the development of new models incorporating physical knowledge.

    In this talk, we present a network...

    Go to contribution page
  60. Leo Piilonen (Virginia Tech)
    12/03/2019, 19:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    I describe a novel interactive virtual reality visualization of the Belle II detector at KEK and the animation therein of GEANT4-simulated event histories. Belle2VR runs on Oculus and Vive headsets (as well as in a web browser and on 2D computer screens, in the absence of a headset). A user with some particle-physics knowledge manipulates a gamepad or hand controller(s) to interact with and...

    Go to contribution page
  61. Rene Brun (CERN)
    13/03/2019, 09:00
    Track 1: Computing Technology for Physics Research
    Plenary
  62. Paolo Calafiura (Lawrence Berkeley National Laboratory)
    13/03/2019, 09:40
    Plenary
  63. Sadaf Alam (Swiss Supercomputing Center (CSCS))
    13/03/2019, 10:10
    Track 1: Computing Technology for Physics Research
    Plenary
  64. Dr Isabelle D Cherney (Merrimack University)
    13/03/2019, 11:00
    Track 1: Computing Technology for Physics Research
    Plenary

    Women obtain more than half of U.S. undergraduate degrees in biology, chemistry, and mathematics, yet they earn less than 20% of computer science, engineering, and physics undergraduate degrees (NSF, 2014). Why are women represented in some STEM fields more than others? The STEM Paradox and the Gender Equality Paradox show that countries with greater gender equality have a lower percentage of...

    Go to contribution page
  65. 13/03/2019, 11:20
    Plenary
  66. Felix Schuermann (EPFL, Blue Brain project)
    13/03/2019, 12:00
    Plenary

    Modern electronic general-purpose computing has been on an unparalleled path of exponential acceleration for more than 7 decades. From the 1970s onwards, this trend was driven by the success of integrated circuits based on silicon technology. The exponential growth has become a self-fulfilling (and economically driven) prophecy commonly referred to as Moore’s Law. The end of Moore’s law has...

    Go to contribution page
  67. Thomas Alef (University of Bonn (DE))
    13/03/2019, 15:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Multivariate analyses in particle physics often reach a precision such that their uncertainties are dominated by systematic effects. While there are known strategies to mitigate systematic effects based on adversarial neural nets, the application of Boosted Decision Trees (BDT) so far had to ignore systematics in the training.
    We present a method to incorporate systematic uncertainties into a...

    Go to contribution page
  68. Dr Charles Leggett (Lawrence Berkeley National Lab (US))
    13/03/2019, 15:30
    Track 1: Computing Technology for Physics Research
    Oral

    The next generation of HPC and HTC facilities, such as Oak Ridge’s Summit, Lawrence Livermore’s Sierra, and NERSC's Perlmutter, show an increasing use of GPGPUs and other accelerators in order to achieve their high FLOP counts. This trend will only grow with exascale facilities such as A21. In general, High Energy Physics computing workflows have made little use of GPUs due to the relatively...

    Go to contribution page
  69. Ludovic Michel Scyboz (Max-Planck-Institut fur Physik (DE))
    13/03/2019, 15:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    While the Higgs boson couplings to other particles are increasingly well-measured by LHC experiments, it has proven difficult to set constraints on the Higgs trilinear self-coupling $\lambda$, principally due to the very low cross-section of Higgs boson pair production. We present the results of NLO QCD corrections to Higgs pair production with full top-quark mass dependence, where the...

    Go to contribution page
  70. Joshua Davies (Karlsruhe Institute of Technology)
    13/03/2019, 15:50
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    In this talk, we consider some of the computational aspects encountered in recent computations of double Higgs boson production in gluon fusion. We consider the NLO virtual amplitude in the high-energy limit, and the NNLO virtual amplitude in the low-energy (or large top quark mass) limit. We discuss various optimizations which were necessary to produce our results.

    Go to contribution page
  71. Mr Nikita Kazeev (Yandex School of Data Analysis (RU))
    13/03/2019, 15:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Analysis in high-energy physics usually deals with data samples populated from different sources. One of the most widely used ways to handle this is the sPlot technique. In this technique the results of a maximum likelihood fit are used to assign weights that can be used to disentangle signal from background. Some events are assigned negative weights, which makes it difficult to apply machine...

    Go to contribution page
  72. Giuseppe Avolio (CERN)
    13/03/2019, 15:50
    Track 1: Computing Technology for Physics Research
    Oral

    The ATLAS experiment at the Large Hadron Collider at CERN relies on a complex and highly distributed Trigger and Data Acquisition (TDAQ) system to gather and select particle collision data obtained at unprecedented energy and rates. The TDAQ Controls system is the component that guarantees the smooth and synchronous operations of all the TDAQ components and provides the means to minimize the...

    Go to contribution page
  73. Benjamin Fischer (Rheinisch Westfaelische Tech. Hoch. (DE))
    13/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Variable-dependent scale factors are commonly used in HEP to improve shape agreement of data and simulation. The choice of the underlying model is of great importance, but often requires a lot of manual tuning e.g. of bin sizes or fitted functions. This can be alleviated through the use of neural networks and their inherent powerful data modeling capabilities.
    We present a novel and...

    Go to contribution page
  74. Carlos Solans Sanchez (CERN)
    13/03/2019, 16:10
    Track 1: Computing Technology for Physics Research
    Oral

    The ATLAS experiment at the LHC at CERN will move to use the Front-End Link eXchange (FELIX) system in a staged approach for LHC Run 3 (2021) and LHC Run 4 (2026). FELIX will act as the interface between the data acquisition; detector control and TTC (Timing, Trigger and Control) systems; and new or updated trigger and detector front-end electronics.
    FELIX functions as a router between custom...

    Go to contribution page
  75. Dr Narayan Rana (INFN Milan)
    13/03/2019, 16:10
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    We present an algorithm which allows one to solve analytically linear systems of differential equations which factorize to first order. The solution is given in terms of iterated integrals over an alphabet whose structure is implied by the coefficient matrix of the differential equations. These systems appear in a large variety of higher-order calculations in perturbative Quantum Field...

    Go to contribution page
  76. Pablo de Castro (Universita e INFN, Padova (IT))
    13/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Complex computer simulations are commonly required for accurate data modelling in many scientific disciplines, including experimental High Energy Physics, making statistical inference challenging due to the intractability of the likelihood evaluation for the observed data. Furthermore, sometimes one is interested in inference drawn over a subset of the generative model parameters while taking...

    Go to contribution page
  77. Daniel Maitre
    13/03/2019, 16:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    In this contribution I will discuss the practicalities of storing events from an NNLO calculation on disk with a view to "replaying" the simulation for a different analysis and under different conditions, such as a different PDF fit or a different scale setting.

    Go to contribution page
  78. Misha Borodin (University of Iowa (US))
    13/03/2019, 16:30
    Track 1: Computing Technology for Physics Research
    Oral

    The ATLAS production system, ProdSys2, is used during Run 2 to define
    and organize workflows and to schedule, submit and execute payloads
    in a distributed computing infrastructure. ProdSys2 is designed to manage
    all ATLAS workflows: data (re)processing, MC simulation, physics group
    analysis object production, High Level Trigger processing, software release
    building and user analysis. It...

    Go to contribution page
  79. Andreas Sogaard (University of Edinburgh (GB))
    13/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    A large number of physics processes seen by ATLAS at the LHC manifest as collimated, hadronic sprays of particles known as ‘jets’. Jets originating from the hadronic decay of a massive particle are commonly used both in measurements of the Standard Model and in searches for new physics. The ATLAS experiment has applied machine learning discriminants to the challenging task of...

    Go to contribution page
  80. Simone Francescato (Sapienza Universita e INFN, Roma I (IT))
    13/03/2019, 16:50
    Track 1: Computing Technology for Physics Research
    Oral

    The Level-0 Muon Trigger system of the ATLAS experiment will undergo a full upgrade for the HL-LHC to meet the challenging performance requirements imposed by the increasing instantaneous luminosity. The upgraded trigger system foresees sending RPC raw hit data to the off-detector trigger processors, where the trigger algorithms run on a new generation of Field-Programmable Gate Arrays (FPGAs). The FPGA...

    Go to contribution page
  81. Andrii Verbytskyi (Max-Planck-Institut fur Physik (DE))
    13/03/2019, 16:50
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    We present the HepMC3 library, designed to perform manipulations of
    event records of High Energy Physics Monte Carlo Event Generators
    (MCEGs). The library is a natural successor of the HepMC and HepMC2
    libraries used at present and in the past. HepMC3 supports all the
    functionality of the previous versions and significantly extends it.

    In comparison to the previous versions, the default event...

    Go to contribution page
  82. Dr Andrea Bocci (CERN)
    13/03/2019, 17:10
    Track 1: Computing Technology for Physics Research
    Oral

    The CMS experiment has been designed with a two-level trigger system: the Level 1 Trigger, implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a trade-off between the complexity of the algorithms running on the available computing resources,...

    Go to contribution page
  83. Dr Steven Prohira (The Ohio State University)
    13/03/2019, 17:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In radio-based physics experiments, sensitive analysis techniques are often required to extract signals at or below the level of noise. For a recent experiment at the SLAC National Accelerator Laboratory to test a radar-based detection scheme for high energy neutrino cascades, such a sensitive analysis was employed to dig down into a spurious background and extract a signal. This analysis...

    Go to contribution page
  84. Collaboration CMS, Tommaso Boccali (INFN Sezione di Pisa, Universita' e Scuola Normale Superiore, P)
    13/03/2019, 18:00
    Track 1: Computing Technology for Physics Research
    Oral

    The next LHC runs, nominally Run III and Run IV, pose problems to the offline and computing systems of CMS. Run IV in particular will need completely different solutions, given the current estimates of the LHC and trigger conditions. We want to report on the R&D process that CMS as a whole has established in order to gain insight into the needs and the possible solutions for 2020+ CMS computing.

    Go to contribution page
  85. Andy Buckley (University of Glasgow (GB))
    13/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Gambit collaboration is a new effort in the world of global BSM fitting -- the combination of the largest possible set of observational data from across particle, astro, and nuclear physics to gain a synoptic view of what experimental data has to say about models of new physics. Using a newly constructed, open source code framework, Gambit have released several state-of-the-art scans of...

    Go to contribution page
  86. York Schröder (UBB Chillán)
    13/03/2019, 18:00
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    I briefly review the recently finished 5-loop renormalization program of QCD, and explain the status and prospects of the computer-algebraic techniques involved.

    Go to contribution page
  87. Igor Kondrashuk (UBB)
    13/03/2019, 18:20
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    We propose an algorithm to find a solution to an integro-differential equation of the DGLAP type to all orders in the running coupling α, with splitting functions given at a fixed order in α. Complex analysis is used extensively in the construction of the algorithm; we found a way to calculate the involved integrals over contours in the complex plane in a simpler way than by any of...

    Go to contribution page
  88. Adrian Alan Pol (Université Paris-Saclay (FR))
    13/03/2019, 18:20
    Track 1: Computing Technology for Physics Research
    Oral

    Certifying that the data recorded by the Compact Muon Solenoid (CMS) experiment at CERN are usable for publication of physics results is a crucial and onerous task. Anomalies caused by detector malfunctioning or sub-optimal data processing are difficult to enumerate a priori and occur rarely, making it difficult to use classical supervised classification. We base our prototype towards the...

    Go to contribution page
  89. Mr Victor Estrade (LRI)
    13/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Data analyses based on forward simulation often require the use of a machine learning model for statistical inference of the parameters of interest.
    Most of the time these learned models are trained to discriminate events between backgrounds and signals to produce a 1D score, which is used to select a relatively pure signal region.
    The training of the model does not take into account the final...

    Go to contribution page
  90. Lukas Alexander Heinrich (New York University (US))
    13/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    A common goal in the search for new physics is the determination of sets of New Physics models, typically parametrized by a number of parameters such as masses or couplings, that are either compatible with the observed data or excluded by it, where the determination of which category a given model belongs to requires expensive computation of the expected signal. This problem may be abstracted...

    Go to contribution page
  91. Federico Lazzari (Universita & INFN Pisa (IT))
    13/03/2019, 18:40
    Track 1: Computing Technology for Physics Research
    Oral

    The hardware trigger L0 will be removed in LHCb Upgrade I, and the software High Level Trigger will have to process events at the full LHC collision rate (30 MHz). This is a huge task, and delegating some low-level, time-consuming tasks to FPGA accelerators can be very helpful in saving computing time that can be more usefully devoted to higher-level tasks. In particular, the 2-D pixel geometry of the...

    Go to contribution page
  92. Dr William Lawrence Sutcliffe (Karlsruhe Institute of Technology (DE))
    13/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The Belle II experiment is an e+e- collider experiment in Japan, which
    begins its main physics run in early 2019. The clean environment of e+e-
    collisions together with the unique event topology of Belle II, in which
    an Υ(4S) particle is produced and subsequently decays to a pair of B
    mesons, allows a wide range of physics measurements to be performed
    which are difficult or impossible at...

    Go to contribution page
  93. Yuka Takahashi (University of Tokyo)
    13/03/2019, 19:00
    Track 1: Computing Technology for Physics Research
    Oral

    ROOT has several features which interact with libraries and require implicit header inclusion. This can be triggered by reading or writing data on disk, or user actions at the prompt. Often, the headers are immutable and reparsing is redundant. C++ Modules are designed to minimize the reparsing of the same header content by providing an efficient on-disk representation of C++ Code. ROOT has...

    Go to contribution page
  94. Johann Brehmer (New York University)
    14/03/2019, 09:00
    Plenary

    An important part of the LHC legacy will be precise limits on indirect effects of new physics, framed for instance in terms of an effective field theory. These measurements often involve many theory parameters and observables, which makes them challenging for traditional analysis methods. We discuss the underlying problem of “likelihood-free” inference and present powerful new analysis...

    Go to contribution page
  95. Gordon Watts (University of Washington, Seattle (US))
    14/03/2019, 09:40

    The HEP software ecosystem faces new challenges in 2020 with the approach of the High Luminosity LHC (HL-LHC) and the turn-on of a number of large new experiments. Current software development is organized around the experiments: No other field has attained this level of self-organization and collaboration in software development.

    During 2017 the community produced a roadmap for...

    Go to contribution page
  96. Soumith Chintala (Facebook AI Research)
    14/03/2019, 10:10
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  97. Jean-Roch Vlimant (California Institute of Technology (US))
    14/03/2019, 11:00
    Track 2: Data Analysis - Algorithms and Tools
    Plenary

    The HL-LHC will see ATLAS and CMS record proton bunch collisions with track multiplicities of up to 10,000 charged tracks per event. Algorithms need to be developed to harness the increased combinatorial complexity. To engage the Computer Science community to contribute new ideas, we organize a Tracking Machine Learning challenge (TrackML). Participants are provided events with 100k 3D points, and...

    Go to contribution page
  98. 14/03/2019, 11:20
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  99. Makiko Nio (Nishina Center, RIKEN)
    14/03/2019, 12:00
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Plenary

    The anomalous magnetic moment of the electron $a_e$ and that of the muon $a_\mu$ occupy the special positions for precision tests of the Standard Model of elementary particles. Both have been precisely measured, 0.24 ppb for $a_e$ and 0.5 ppm for $a_\mu$, and new experiments of both $a_e$ and $a_\mu$ are on-going aiming to reduce the uncertainties. Theoretical calculations of $a_e$ and $a_\mu$...

    Go to contribution page
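
    For reference, the anomalous magnetic moment discussed above is the relative deviation of the lepton gyromagnetic factor from its Dirac value, $a_\ell = (g_\ell - 2)/2$, evaluated separately for $\ell = e, \mu$.
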
  100. Marcel Rieger (Rheinisch-Westfaelische Tech. Hoch. (DE))
    14/03/2019, 15:30
    Track 1: Computing Technology for Physics Research
    Oral

    In particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads.
    We present the luigi analysis workflow (law)...

    Go to contribution page
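
    A minimal sketch of the make-like workflow style referred to in this contribution, written with plain luigi (on which law builds); the task and file names are invented:

        import luigi

        class Selection(luigi.Task):
            dataset = luigi.Parameter()

            def output(self):
                return luigi.LocalTarget(f"selected_{self.dataset}.txt")

            def run(self):
                with self.output().open("w") as f:
                    f.write(f"selected events for {self.dataset}\n")

        class Histogram(luigi.Task):
            dataset = luigi.Parameter()

            def requires(self):
                # dependency: the selection must exist before the histogram runs
                return Selection(dataset=self.dataset)

            def output(self):
                return luigi.LocalTarget(f"hist_{self.dataset}.txt")

            def run(self):
                with self.input().open() as fin, self.output().open("w") as fout:
                    fout.write("histogram from: " + fin.read())

        if __name__ == "__main__":
            luigi.build([Histogram(dataset="ttbar")], local_scheduler=True)
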
  101. Sergey Shirobokov (Imperial College (GB))
    14/03/2019, 15:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    We investigate the problem of dark matter detection in an emulsion detector. Previously we have shown that it is very challenging, but possible, to use emulsion films of an OPERA-like detector in the SHiP experiment to separate electromagnetic showers from each other, thus hypothetically separating neutrino events from dark matter. In this study, we have investigated the possibility of using Target...

    Go to contribution page
  102. Dr Patrick Bos (Netherlands eScience Center / Nikhef)
    14/03/2019, 15:50
    Track 1: Computing Technology for Physics Research
    Oral

    RooFit is the statistical modeling and fitting package used in many big particle physics experiments to extract physical parameters from reduced particle collision data, e.g. the Higgs boson experiments at the LHC.
    RooFit aims to separate particle physics model building and fitting (the users' goals) from their technical implementation and optimization in the back-end.
    In this talk, we...

    Go to contribution page
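
    A very small sketch of the user-facing model building and fitting that RooFit provides, as background to the optimisation work described in this contribution (a toy Gaussian fit; all names and numbers are illustrative):

        import ROOT

        x = ROOT.RooRealVar("x", "observable", -10, 10)
        mean = ROOT.RooRealVar("mean", "mean", 0, -5, 5)
        sigma = ROOT.RooRealVar("sigma", "sigma", 1, 0.1, 5)
        gauss = ROOT.RooGaussian("gauss", "gaussian pdf", x, mean, sigma)

        data = gauss.generate(ROOT.RooArgSet(x), 10000)   # toy dataset
        gauss.fitTo(data)                                  # maximum-likelihood fit
        mean.Print()
        sigma.Print()
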
  103. Mr Constantin Steppa (University of Potsdam)
    14/03/2019, 16:10
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Ground-based $\gamma$-ray astronomy relies on reconstructing primary particles' properties from the measurement of the induced air showers. Currently, template fitting is the state-of-the-art method to reconstruct air showers. CNNs represent a promising means to improve on this method in both accuracy and computational cost. Promoted by the availability of inexpensive hardware and open-source...

    Go to contribution page
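
    An illustrative-only CNN skeleton of the kind this contribution explores for air-shower reconstruction (toy input shape and output targets, no real data or training):

        import tensorflow as tf

        model = tf.keras.Sequential([
            # toy 40x40 camera image with one channel
            tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(40, 40, 1)),
            tf.keras.layers.Conv2D(32, 3, activation="relu"),
            tf.keras.layers.GlobalAveragePooling2D(),
            tf.keras.layers.Dense(2),   # e.g. regress energy and core distance
        ])
        model.compile(optimizer="adam", loss="mse")
        model.summary()
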
  104. Daniel Hugo Campora Perez (Universidad de Sevilla (ES))
    14/03/2019, 16:10
    Track 1: Computing Technology for Physics Research
    Oral

    The LHCb detector will be upgraded in 2021, and due to the removal of the hardware-level trigger and the increase in the luminosity of the collisions, the conditions for a High Level Trigger 1 in software will become more challenging, requiring processing the full 30 MHz data-collision rate. The GPU High Level Trigger 1 is a framework that permits concurrent many-event execution targeting...

    Go to contribution page
  105. Thomas Hahn (MPI f. Physik)
    14/03/2019, 16:10
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    I give an update on recent developments in FeynArts, FormCalc, and LoopTools, and show how the new features were used in making the latest version of FeynHiggs.

    Go to contribution page
  106. Jonas Glombitza (RWTH Aachen)
    14/03/2019, 16:30
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    In recent years, the astroparticle physics community has successfully adapted supervised learning algorithms for a wide range of tasks, including event reconstruction in cosmic ray observatories[1], photon identification at Cherenkov telescopes[2], and the extraction of gravitational wave signals from time traces[3]. In addition, first unsupervised learning approaches of generative models at...

    Go to contribution page
  107. Noel Aaron Nottbeck (Johannes Gutenberg Universitaet Mainz (DE))
    14/03/2019, 16:30
    Track 1: Computing Technology for Physics Research
    Oral

    Artificial neural networks are becoming a standard tool for data analysis, but their potential has yet to be widely exploited for hardware-level trigger applications. Nowadays, high-end FPGAs, as they are also often used in low-level hardware triggers, offer enough performance to allow for the inclusion of networks of considerable size into these systems for the first time. Nevertheless, in the...

    Go to contribution page
  108. Dr Wolfgang Waltenberger (Austrian Academy of Sciences (AT))
    14/03/2019, 16:30
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The software framework SModelS, which has already been presented at the ACAT 2016 conference, allows for a very fast confrontation of arbitrary BSM models exhibiting a Z2 symmetry with an ever growing database of simplified models results from CMS and ATLAS. In this talk we shall present its newest features, like the extension to include searches for heavy stable charged particles (HSCPs), or...

    Go to contribution page
  109. Takahiro Ueda (KEK)
    14/03/2019, 16:50
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    FORM is a symbolic manipulation system, which is especially advantageous for handling gigantic expressions with many small terms. Because FORM has been developed in tackling real problems in perturbative quantum field theory, it has some features useful in such problems, although FORM applications are not restricted to any specific research field. In this talk, we discuss recent developments...

    Go to contribution page
  110. Jim Pivarski (Princeton University)
    14/03/2019, 16:50
    Track 1: Computing Technology for Physics Research
    Oral

    Nested data structures are critical for particle physics: it would be impossible to represent collision data as events containing arbitrarily many particles in a rectangular table (without padding or truncation, or without relational indirection). These data structures are usually constructed as class objects and arbitrary length sequences, such as vectors in C++ and lists in Python, and data...

    Go to contribution page
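
    A short sketch of the columnar, arbitrarily nested event/particle structure discussed in this contribution, using the awkward-array library (1.x API); the field names and values are invented:

        import awkward as ak

        events = ak.Array([
            [{"pt": 35.2, "eta": 0.4}, {"pt": 17.8, "eta": -1.1}],  # 2 particles
            [],                                                      # empty event
            [{"pt": 52.0, "eta": 2.0}],
        ])

        print(ak.num(events))               # particles per event: [2, 0, 1]
        print(events.pt[events.pt > 20])    # jagged selection, no explicit loops
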
  111. Laura Domine (Stanford University/SLAC)
    14/03/2019, 16:50
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    From a breakthrough revolution, Deep Learning (DL) has grown to become a de facto standard technique in the fields of artificial intelligence and computer vision. In particular, Convolutional Neural Networks (CNNs) have been shown to be a powerful DL technique to extract physics features from images: they were successfully applied to the data reconstruction and analysis of Liquid Argon Time...

    Go to contribution page
  112. Soon Yung Jun (Fermi National Accelerator Lab. (US))
    14/03/2019, 17:10
    Track 1: Computing Technology for Physics Research
    Oral

    Efficient random number generation with high quality statistical properties and exact reproducibility of Monte Carlo simulation are important requirements in many areas of computational science. VecRNG is a package providing pseudo-random number generation (pRNG) in the context of a new library VecMath. This library bundles up several general-purpose mathematical utilities, data structures...

    Go to contribution page
  113. Gevy Cao (Queen's University)
    14/03/2019, 18:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    PICO is a dark matter experiment using superheated bubble chamber technology. One of the main analysis challenges in PICO is to unambiguously distinguish between background events and nuclear recoil events from possible WIMP scatters. The conventional discriminator, acoustic parameter (AP), utilizes frequency analysis in Fourier space to compute the acoustic power, which is proven to be...
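
    A toy version of such a frequency-domain discriminator (purely illustrative; the real acoustic parameter involves detector-specific corrections not shown here, and the sample rate and band edges below are invented) integrates the power spectrum of a digitized acoustic trace over a chosen frequency band:

        # Toy frequency-band acoustic power from a digitized waveform (NumPy).
        import numpy as np

        def band_power(waveform, sample_rate_hz, f_lo, f_hi):
            """Integrate the power spectrum of `waveform` between f_lo and f_hi (Hz)."""
            spectrum = np.fft.rfft(waveform)
            freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sample_rate_hz)
            power = np.abs(spectrum) ** 2
            mask = (freqs >= f_lo) & (freqs < f_hi)
            return power[mask].sum()

        # example: a 1 ms trace sampled at 2.5 MHz with a burst around 150 kHz
        t = np.arange(2500) / 2.5e6
        trace = (np.sin(2 * np.pi * 150e3 * t) * np.exp(-t / 2e-4)
                 + 0.05 * np.random.default_rng(0).normal(size=t.size))
        print(band_power(trace, 2.5e6, 100e3, 200e3))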

    Go to contribution page
  114. Andrei Gheata (CERN)
    14/03/2019, 18:00
    Track 1: Computing Technology for Physics Research
    Oral

    Improving the computing performance of particle transport simulation is an important goal to address the challenges of HEP experiments in the coming decades (e.g. HL-LHC), as well as the needs of other fields (e.g. medical imaging and radiotherapy).
    The GeantV prototype includes a new transport engine, based on track level parallelization by grouping a large number of tracks in flight into...

    Go to contribution page
  115. Stanislav Poslavsky (IHEP, Protvino)
    14/03/2019, 18:00
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The talk is devoted to the overview of Rings — an efficient lightweight library for commutative algebra written in Java and Scala languages. Polynomial arithmetic, GCDs, polynomial factorization and Gröbner bases are implemented with the use of modern asymptotically fast algorithms. Rings can be easily interfaced with or embedded in applications in high-energy physics and other research areas...

    Go to contribution page
  116. Sofia Vallecorsa (CERN)
    14/03/2019, 18:20
    Track 1: Computing Technology for Physics Research
    Oral

    Deep Learning techniques are being studied for different applications by the HEP community: in this talk, we discuss the case of detector simulation. The need for simulated events, expected in the future for LHC experiments and their High Luminosity upgrades, is increasing dramatically and requires new fast simulation solutions. We will describe an R&D activity within CERN openlab, aimed...

    Go to contribution page
  117. Ben Ruijl (Nikhef)
    14/03/2019, 18:20
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    Over the last few years, manipulating expressions with millions of terms has become common in particle physics. FORM is the de facto tool for manipulations of extremely large expressions, but it comes with some downsides. In this talk I will discuss an effort to modernize aspects of FORM, such as the language and workflow, and the introduction of bindings to C and Python. This new tool is...

    Go to contribution page
  118. Dennis Noll (Rheinisch Westfaelische Tech. Hoch. (DE))
    14/03/2019, 18:20
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Deep learning architectures in particle physics are often strongly dependent on the order of their input variables. We present a two-stage deep learning architecture consisting of a network for sorting input objects and a subsequent network for data analysis. The sorting network (agent) is trained through reinforcement learning using feedback from the analysis network (environment). A tree...

    Go to contribution page
  119. David Lawrence (Jefferson Lab)
    14/03/2019, 18:40
    Track 1: Computing Technology for Physics Research
    Oral

    JANA2 is a multi-threaded event reconstruction framework being developed for Experimental Nuclear Physics. It is an LDRD-funded project that will be the successor of the original JANA framework. JANA2 is a near-complete rewrite emphasizing C++ language features that have only become available since the C++11 standard. Successful and less-than-successful strategies employed in JANA and how they...

    Go to contribution page
  120. Dr Michelangelo Preti (Ecole Normale Supérieure - CNRS)
    14/03/2019, 18:40
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The Mathematica package STR (Star-Triangle Relations) is a recently developed tool designed to solve Feynman diagrams by means of the method of uniqueness in any (Euclidean) spacetime dimension D. The method of uniqueness is a powerful technique to solve multi-loop Feynman integrals in theories with conformal symmetry imposing some relations between D and the powers of propagators. In our...

    Go to contribution page
  121. Artem Ryzhikov (Yandex School of Data Analysis (RU))
    14/03/2019, 18:40
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Accurate particle identification (PID) is one of the most important aspects of the LHCb experiment. Modern machine learning techniques such as deep neural networks are efficiently applied to this problem and are integrated into the LHCb software. In this research, we discuss novel applications of neural network speed-up techniques to achieve faster PID in LHC upgrade conditions. We show that...

    Go to contribution page
  122. Michael Poat (Brookhaven National Laboratory), Jefferson Porter, Jan Balewski
    14/03/2019, 19:00
    Track 1: Computing Technology for Physics Research
    Oral

    The Solenoidal Tracker at RHIC (STAR) is a multi-national supported experiment located at Brookhaven National Lab. The raw physics data captured from the detector is on the order of tens of PBytes per data acquisition campaign, which makes STAR fit well within the definition of a big data science experiment. The production of the data has typically run on standard nodes or on standard Grid...

    Go to contribution page
  123. Olmo Cerri (California Institute of Technology (US))
    14/03/2019, 19:00
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Using variational autoencoders trained on known physics processes, we develop a one-sided p-value test to isolate previously unseen event topologies as outlier events. Since the autoencoder training does not depend on any specific new physics signature, the proposed procedure has a weak dependence on underlying assumptions about the nature of new physics. An event selection based on this...
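
    The selection logic of such a test can be sketched independently of the autoencoder: given reconstruction losses for a reference (known-physics) sample, each new event receives a one-sided p-value equal to the fraction of reference events with a larger loss (illustrative code only, with synthetic losses, not the authors' implementation):

        # Sketch: one-sided p-values from autoencoder reconstruction losses (NumPy).
        import numpy as np

        def one_sided_pvalues(reference_losses, event_losses):
            """p-value = fraction of reference events with a loss at least as large."""
            ref = np.sort(np.asarray(reference_losses))
            # number of reference losses >= each event loss, via binary search
            n_ge = ref.size - np.searchsorted(ref, event_losses, side="left")
            return n_ge / ref.size

        rng = np.random.default_rng(0)
        sm_losses  = rng.exponential(1.0, size=100_000)   # stand-in for known-physics events
        new_losses = np.array([0.5, 3.0, 12.0])           # stand-in for candidate events
        pvals = one_sided_pvalues(sm_losses, new_losses)
        outliers = pvals < 1e-3                            # selection threshold
        print(pvals, outliers)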

    Go to contribution page
  124. Francois Lanusse (Lawrence Berkeley National Laboratory )
    15/03/2019, 09:00
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  125. Andy Buckley (University of Glasgow (GB))
    15/03/2019, 09:30
    Plenary
  126. Kazuhiro Terao (SLAC)
    15/03/2019, 10:00
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  127. Dorothea Vom Bruch (LPNHE Paris, CNRS)
    15/03/2019, 11:00
    Track 1: Computing Technology for Physics Research
    Plenary

    Beginning in 2021, the upgraded LHCb experiment will use a triggerless readout system collecting data at an event rate of 30 MHz. A software-only High Level Trigger will enable unprecedented flexibility for trigger selections. During the first stage (HLT1), a subset of the full offline track reconstruction for charged particles is run to select particles of interest based on single or...

    Go to contribution page
  128. Luis Miguel Garcia Martin (Univ. of Valencia and CSIC (ES))
    15/03/2019, 11:10
    Track 2: Data Analysis - Algorithms and Tools
    Plenary

    The LHCb experiment is dedicated to the study of c- and b-hadron decays, including long-lived particles such as the Ks and strange baryons (Lambda, Xi, etc.). These kinds of particles are difficult to reconstruct with the LHCb tracking systems since they escape detection in the first tracker. A new method to evaluate the performance in terms of efficiency and throughput of the different...

    Go to contribution page
  129. Patricia Mendez Lorenzo (CERN)
    15/03/2019, 11:20
    Track 1: Computing Technology for Physics Research
    Plenary
  130. Andrey Ustyuzhanin (Yandex School of Data Analysis (RU))
    15/03/2019, 11:35
    Track 2: Data Analysis - Algorithms and Tools
    Plenary
  131. Ben Ruijl (Nikhef)
    15/03/2019, 11:50
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Plenary
  132. Gordon Watts (University of Washington (US))
    15/03/2019, 12:05
    Track 1: Computing Technology for Physics Research
    Plenary
  133. Federico Carminati (CERN)
    15/03/2019, 12:20
    Track 1: Computing Technology for Physics Research
  134. Mrs Li Liao (Institute of Applied Physics and Computational Mathematics,Beijing)
    Track 1: Computing Technology for Physics Research
    Poster

    The scientific computing community is suffering from a lack of good development tools that can handle the unique problems of coding for high-performance computing. It is much more difficult for domain experts to parallelize inherited serial codes written in FORTRAN, which are very common in the CSE research field. An automatic parallel programming IDE has been developed for rapid development of...

    Go to contribution page
  135. Riccardo Farinelli (Universita e INFN, Ferrara (IT))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Triple-GEM detectors are a well known technology in high energy physics. In order to have a complete understanding of their behavior, in parallel with on-beam testing, a Monte Carlo code has to be developed to simulate their response to the passage of particles. The software must take into account all the physical processes involved from the primary ionization up to the signal formation,...

    Go to contribution page
  136. Maria Grigoryeva (Institute for Theoretical and Experimental Physics (RU))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The ATLAS experiment at the LHC has a complex heterogeneous distributed
    computing infrastructure, which is used to process and analyse exabytes of data. Metadata are collected and stored at all stages of physics analysis and data processing. All metadata can be divided into operational metadata, used for quasi-online monitoring, and archival metadata, used to study the systems’ behaviour over a...

    Go to contribution page
  137. Prof. Ilkay Turk Cakir (Giresun University)
    Track 1: Computing Technology for Physics Research
    Poster

    A Tier-3g Facility within the computing resources of Istanbul Aydin
    University has been planned and installed in collaboration with TR-ULAKBIM national
    Tier-2 center. The facility is intended to provide an upgraded data analysis
    infrastructure to CERN researchers, in view of the recent nation-wide projects of the ATLAS and
    CMS experiments. The fundamental design of the Tier-3g has been detailed in...

    Go to contribution page
  138. Gordon Watts (University of Washington (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    MATHUSLA has been proposed as a second detector that sits over 100m from an LHC interaction point, on the surface, to look for ultra-long-lived particles. A test stand was constructed with 2 layers of scintillator paddles and 3 layers of RPCs, on loan from the DZERO and Argo-YBJ experiments. Downward- and upward-going tracks from cosmic rays and from muons from the interaction point have been reconstructed. To...

    Go to contribution page
  139. Werner Spolidoro Freund (Federal University of of Rio de Janeiro (BR))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The ATLAS experiment implemented an ensemble of neural networks
    (NeuralRinger algorithm) dedicated to improve the performance of
    filtering events containing electrons in the high-input rate online
    environment of the Large Hadron Collider at CERN, Geneva.
    This algorithm has been used online to select electrons with transverse energies
    above 15 GeV since 2017 and is extended to electrons...

    Go to contribution page
  140. Pier Paolo Ricci (INFN CNAF)
    Track 1: Computing Technology for Physics Research
    Poster

    During the last years we have carried out a renewal of the Building Management System (BMS) software of our data center with the aim of improving the data collection capability. Considering the complex physical distribution of the technical plants and the limits of the actual building hosting our center, a system that simply monitors and collects all the necessary information and provides...

    Go to contribution page
  141. Wen Guan (University of Wisconsin (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    We will present our experiences and preliminary studies on LHC high
    energy physics data analysis with quantum simulators and IBM quantum
    computer hardware using IBM Qiskit. The performance is compared with the
    results of a classical machine learning method applied to a physics
    process, Higgs coupling to two top quarks, as an example. This work is a
    collaboration between University of...

    Go to contribution page
  142. Minh Duc Nguyen (Lomonosov Moscow State University Skobeltsyn Institute of Nuclear Physics)
    Track 1: Computing Technology for Physics Research
    Poster

    CernVM-FS is a solution for scalable, reliable and low-maintenance software distribution that is widely used by various High Energy Physics collaborations. The information that can be distributed by CernVM-FS is not limited to software; any other data can be distributed as well. By default, the whole CernVM-FS repository containing all subdirectories and files is available to all users in read-only mode after...

    Go to contribution page
  143. Rafal Bielski (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    Athena is the software framework used in the ATLAS experiment throughout the data processing path, from the software trigger system through offline event reconstruction to physics analysis. The shift from high-power single-core CPUs to multi-core systems in the computing market means that the throughput capabilities of the framework have become limited by the available memory per process. For...

    Go to contribution page
  144. Yana Zhezher (INR RAS & Lomonosov MSU)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The Telescope Array experiment, located in Utah, USA, is aimed at the study of ultra-high-energy cosmic rays through the detection of extensive air showers (EAS). The surface detector of the Telescope Array provides multivariate data reconstructed from the waveforms of the signals of the detectors which took part in a particular event. Moreover, a number of variables are composition-sensitive and...

    Go to contribution page
  145. Christoph Heidecker (KIT - Karlsruhe Institute of Technology (DE))
    Track 1: Computing Technology for Physics Research
    Poster

    Data-intensive end-user analyses in High Energy Physics require high data throughput to reach short turnaround cycles.
    This leads to enormous challenges for storage and network infrastructure, especially when facing the tremendously increasing amount of data to be processed during High-Luminosity LHC runs.
    Including opportunistic resources with volatile storage systems into the traditional...

    Go to contribution page
  146. Pier Paolo Ricci (INFN CNAF)
    Track 1: Computing Technology for Physics Research
    Poster

    The INFN CNAF Tier-1 Long Term Data Preservation (LTDP) project was established at the end of 2012 in close collaboration with Fermi National Accelerator Laboratory (FNAL) with the purpose of saving, distributing and maintaining over time the CDF Tevatron analysis framework and all the relevant scientific data produced by the experiment activity. During recent years, a complete copy of all CDF...

    Go to contribution page
  147. Simone Mosciatti (Politecnico di Milano (IT))
    Track 1: Computing Technology for Physics Research
    Poster

    Linux containers have gained widespread use in High Energy Physics, be it for services using container engines such as containerd/kubernetes, for production jobs using container engines such as Singularity or Shifter, or for development workflows using Docker as a local container engine. Thus the efficient distribution of the container images, whose size usually ranges from a few hundred...

    Go to contribution page
  148. Dr Alexander Kryukov (SINP MSU)
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    Tensor calculations are important in many sciences, such as mathematics and physics. To simplify such expressions, computer algebra is widely used. There are a number of approaches to this problem, namely component calculations, calculations in which a tensor is treated as an abstract symbol with indices possessing some symmetry properties, and finally a pure...

    Go to contribution page
  149. Andrei Kazarov (NRC Kurchatov Institute PNPI (RU)), Alina Corso Radu (University of California Irvine (US))
    Track 1: Computing Technology for Physics Research
    Poster

    The ATLAS experiment at the Large Hadron Collider (LHC) operated successfully from 2008 to 2018, which included Run 1 (2008-2013), a shutdown period and Run 2 (2016-2018). In the course of Run 2, ATLAS data taking achieved an overall efficiency of 97%, largely constrained by the irreducible dead-time introduced to accommodate the limitations of the detector read-out...

    Go to contribution page
  150. Gokhan Unel (University of California Irvine (US))
    Track 1: Computing Technology for Physics Research
    Poster

    Nowadays, any physicist performing an analysis of LHC data needs to be well-versed in programming, at the level of both a system programmer and a software developer, to handle the vast amounts of collision and simulation events. Even the simplest programming mistake in any of these areas can cause major confusion in the analysis results. Moreover, a multitude of different analysis...

    Go to contribution page
  151. Mr Barthelemy Von Haller (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    The ALICE experiment at the CERN LHC focuses on studying the quark-gluon plasma produced by heavy-ion collisions. After the Long Shutdown 2 in 2019-2020, the ALICE Experiment will see its data input throughput increase a hundredfold, up to 3.4 TB/s. In order to cope with such a large amount of data, a new online-offline computing system, called O2, will be deployed. By reconstructing the data...

    Go to contribution page
  152. Andre Sailer (CERN), Frank-Dieter Gaede (Deutsches Elektronen-Synchrotron (DE)), Marko Petric (CERN), Markus Frank (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    The detector description is an essential component in the simulation, reconstruction and analysis of data resulting from particle collisions in high energy physics experiments. The main motivation behind DD4hep is to provide an integrated solution for all these stages and to address detector description in a broad sense, including the geometry and the materials used in the device, and additional...

    Go to contribution page
  153. Prof. Alexander Belyaev (University of Southampton & Rutherford Appleton Laboratory)

    Decoding the nature of Dark Matter (DM) is one of the most important problems of particle physics. DM can potentially provide unique signatures at collider and non-collider experiments. Details of these signatures which we expect to observe in the near future would allow us to delineate the properties of DM and the respective underlying theory Beyond the Standard Model (BSM). While there...

    Go to contribution page
  154. Aishik Ghosh (Centre National de la Recherche Scientifique (FR))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The extensive physics program of the ATLAS experiment at the Large Hadron Collider (LHC) relies on large scale and high fidelity simulation of the detector response to particle interactions. Current full simulation techniques using Geant4 provide accurate modeling of the underlying physics processes, but are inherently resource intensive. In light of the high-luminosity upgrade of the LHC and...

    Go to contribution page
  155. Ms Yao Zhang (Institute of High Energy Physics, China), Prof. Ye Yuan (Institute of High Energy Physics, China)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The drift chamber is the main tracking detector for high-energy physics experiments such as BESIII. Due to the high luminosity and high beam intensity, the drift chamber suffers from beam- and electronics-induced background, which represents a computing challenge for the reconstruction software. Deep learning developments in the last few years have shown tremendous improvements in the analysis of data...

    Go to contribution page
  156. Artem Golovatiuk (University of Naples (IT))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The NEWSdm (Nuclear Emulsions for WIMP Search directional measure) is an underground Direct detection Dark Matter (DM) search experiment. The usage of recent developments in the nuclear emulsions allows probing new regions in the WIMP parameter space. The prominent feature of this experiment is a potential of recording the signal direction, which gives a chance of overcoming the "neutrino...

    Go to contribution page
  157. Wahid Bhimji (Lawrence Berkeley National Lab. (US))
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    We present recent work in deep learning for particle physics and cosmology at NERSC, the US Dept. of Energy mission HPC centre. We will describe activity in new methods and applications; distributed training across HPC resources; and plans for accelerated hardware for deep learning in NERSC-9 (Perlmutter) and beyond.
    Some of the HEP methods and applications showcased include conditional...

    Go to contribution page
  158. Sydney Otten (Radboud Universiteit Nijmegen)
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The recent years have shown an exciting development in the scientific community due to the interplay between new methods from data science and artificial intelligence, increasing computational resources and physics. The fundamental object of our theories of nature is the Lagrangian, whose form is determined by the symmetries found so far. A famous and well-motivated extension of the SM...

    Go to contribution page
  159. Dr Linghui Wu (Institute of High Energy Physics, CAS)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The inner drift chamber of the BESIII experiment is encountering an aging problem after several years of running. A Cylindrical Gas Electron Multiplier Inner Tracker (CGEM-IT) is an important candidate for the upgrade of the inner drift chamber. In order to understand the specific detection behavior of the CGEM-IT and to build a digitization model for it, a detailed simulation study with the...

    Go to contribution page
  160. Dr Johan Bregeon
    Track 1: Computing Technology for Physics Research
    Poster

    Scientific user communities are widely using computing and storage resources provided by large grid infrastructures. More and more capacity is provided by these infrastructures in the form of cloud resources. Cloud resources are much more flexible to use but provide completely different access interfaces. Furthermore, grid infrastructure users are often getting access to extra computing...

    Go to contribution page
  161. Ben Couturier (CERN), Christophe Haen (CERN), Marko Petric (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    All grid middleware require external packages to interact with computing elements, storage sites… In the case of the DIRAC middleware this was historically divided into two bundles, one called externals, containing Python and standard binary libraries, and the other called the LCGBundle, containing libraries from the grid world (gfal, arc, etc). The externals were provided for several platforms...

    Go to contribution page
  162. Dirk Krücker (Deutsches Elektronen-Synchrotron (DE))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    We introduce two new loss functions designed to directly optimise the statistical significance of the expected number of signal events when training neural networks to classify events as signal or background in the scenario of a search for new physics at a particle collider. The loss functions are designed to directly maximise commonly used estimates of the statistical significance, s/√(s+b),...
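
    As an illustration of the general idea (a simplified stand-in with invented numbers, not the exact loss functions of the contribution), the expected yields passing a soft selection can be built from the network outputs and turned into an objective that maximises s/√(s+b):

        # Sketch of a significance-based objective (NumPy; in practice this would be
        # written with an autodiff framework so it can be minimized by gradient descent).
        import numpy as np

        def neg_significance(scores, is_signal, weights):
            """Loss = -s / sqrt(s + b), with events weighted by the network output.

            scores    : network outputs in [0, 1] acting as a soft selection
            is_signal : 1 for signal events, 0 for background events
            weights   : per-event weights normalized to the expected yields
            """
            s = np.sum(weights * scores * is_signal)
            b = np.sum(weights * scores * (1 - is_signal))
            return -s / np.sqrt(s + b + 1e-9)

        rng = np.random.default_rng(1)
        scores = rng.uniform(size=1000)
        labels = rng.integers(0, 2, size=1000)
        weights = np.where(labels == 1, 0.01, 1.0)   # e.g. rare signal, abundant background
        print(neg_significance(scores, labels, weights))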

    Go to contribution page
  163. Benedikt Hegner (Brookhaven National Laboratory (US))
    Track 1: Computing Technology for Physics Research
    Poster

    In 2019 Belle II will start the planned physics runs with the entire detector installed. Compared to current collider experiments at the LHC, where all critical services are provided by CERN as the host lab and only storage and CPU resources are provided externally, Belle II and KEK chose a different, more distributed strategy. In particular, it provides easier access to existing expertise and...

    Go to contribution page
  164. Alina Corso Radu (University of California Irvine (US))
    Track 1: Computing Technology for Physics Research
    Poster

    Information concerning the operation, configuration and behaviour of the ATLAS experiment needs to be reported, gathered and shared reliably with the whole ATLAS community, which comprises over three thousand scientists geographically distributed all over the world. To provide such functionality, a logbook facility, the Electronic Logbook for the information storage of ATLAS (ELisA), has been...

    Go to contribution page
  165. Matei Vasile (IFIN-HH (RO))
    Track 1: Computing Technology for Physics Research
    Poster

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at the Large Hadron Collider (LHC) at CERN is currently composed of a large number of distributed hardware and software components (about 3000 machines and more than 25000 applications) which, in a coordinated manner, provide the data-taking functionality of the overall system.
    During data taking runs, a huge flow of...

    Go to contribution page
  166. Oksana Shadura (University of Nebraska Lincoln (US))
    Track 1: Computing Technology for Physics Research
    Poster

    ROOT is a large code base with a complex set of build-time dependencies; there is a significant difference in compilation time between the “core” of ROOT and the full-fledged deployment. We present results on a “delayed build” for internal ROOT packages and external packages. This gives the ability to offer a “lightweight” core of ROOT, later extended by building additional modules to extend...

    Go to contribution page
  167. Yongsheng Gao (California State University (US))
    Track 1: Computing Technology for Physics Research
    Poster

    Maintaining the huge computing grid facilities for LHC
    experiments and replacing their hardware every few years has been very
    expensive. The California State University (CSU) ATLAS group just
    received $250,000 AWS cloud credit from the CSU Chancellor’s Office to
    build the first virtual US ATLAS Tier 3 to explore cloud solutions for
    ATLAS. We will use this award to set up full ATLAS...

    Go to contribution page
  168. Mr Yavar Taheri Yeganeh (Shahid Beheshti University)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Deep learning has shown a promising future in physics data analysis and is anticipated to revolutionize LHC discoveries.
    Designing an optimal algorithm may be the most challenging task in machine learning, especially in HEP, due to the high dimensionality and extreme complexity of the data.
    Physical knowledge can be employed in designing and modifying the algorithm's...

    Go to contribution page
  169. Ralf Florian Von Cube (KIT - Karlsruhe Institute of Technology (DE))
    Track 1: Computing Technology for Physics Research
    Poster

    The German CMS community (DCMS) as a whole can benefit from the various compute resources, available to its different institutes. While Grid-enabled and National Analysis Facility resources are usually shared within the community, local and recently enabled opportunistic resources like HPC centers and cloud resources are not. Furthermore, there is no shared submission infrastructure...

    Go to contribution page
  170. Konstantin Malanchev
    Track 1: Computing Technology for Physics Research
    Poster

    We present an open source GPU-accelerated cross-platform FITS 2D image viewer FIPS. Unlike other FITS viewers, FIPS uses GPU hardware via OpenGL to provide functionality such as zooming, panning and level adjustments. FIPS is the first end-to-end GPU FITS image viewer: FITS image data is fully offloaded to GPU memory as is, and then processed by OpenGL shaders.

    The executables and the source...

    Go to contribution page
  171. Dr Andrei Kataev (Institute for Nuclear Research of RAS, Moscow )
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    The set of four-loop numerical results for the relation between the pole and running heavy-quark masses in QCD at a fixed number of lighter flavors $3\leq n_l\leq 15$, which was obtained in Ref.[1] with the help of the Lomonosov Supercomputer of Moscow State University, is analysed by the ordinary least-squares method. We use a variant which allows solving the overdetermined system of 13...
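
    The least-squares step itself, solving an overdetermined linear system for a handful of fit coefficients, takes only a few lines; the numbers below are synthetic placeholders, not the four-loop results of Ref.[1]:

        # Sketch: ordinary least squares for an overdetermined system (NumPy).
        # Fit y(n_l) = c0 + c1*n_l + c2*n_l**2 through 13 points (3 unknowns, 13 equations).
        import numpy as np

        n_l = np.arange(3, 16)                    # 13 values of the number of light flavors
        y = 1.0 - 0.1 * n_l + 0.003 * n_l**2      # synthetic "data", not the QCD results
        y = y + np.random.default_rng(0).normal(0, 1e-3, size=n_l.size)

        A = np.vander(n_l, 3, increasing=True)    # columns: 1, n_l, n_l**2
        coeffs, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
        print(coeffs)                             # recovered c0, c1, c2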

    Go to contribution page
  172. Soon Yung Jun (Fermi National Accelerator Lab. (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The Geant4 toolkit is used extensively in high energy physics to simulate the
    passage of particles through matter and to estimate effects such as detector
    responses, efficiencies and smearing. Geant4 uses many underlying models to predict
    particle interaction kinematics, and uncertainty in these models leads to uncertainty in the interpretation of experiment measurements. The Geant4...

    Go to contribution page
  173. Jakub Trusina (Czech Technical University (CZ))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    In High Energy Physics, tests of homogeneity are used primarily in two cases: for verifying that a data sample does not differ significantly from a numerically produced Monte Carlo sample, and for verifying the separation of signal from background. Since Monte Carlo samples are usually weighted, it is necessary to modify classical homogeneity tests in order to apply them to weighted samples. In...
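
    A common building block for adapting such tests to weighted samples is to carry the weights into per-bin variances; the sketch below (plain NumPy, a simple Gaussian-approximation illustration with invented data, not the specific tests discussed in the contribution) compares two weighted histograms using sum(w) as the bin content and sum(w**2) as its variance:

        # Sketch: chi-squared comparison of two weighted histograms (NumPy).
        import numpy as np

        def weighted_chi2(x1, w1, x2, w2, bins):
            s1, _ = np.histogram(x1, bins=bins, weights=w1)
            s2, _ = np.histogram(x2, bins=bins, weights=w2)
            v1, _ = np.histogram(x1, bins=bins, weights=w1**2)
            v2, _ = np.histogram(x2, bins=bins, weights=w2**2)
            good = (v1 + v2) > 0
            chi2 = np.sum((s1[good] - s2[good])**2 / (v1[good] + v2[good]))
            return chi2, good.sum()        # statistic and number of usable bins

        rng = np.random.default_rng(2)
        data_x, data_w = rng.normal(0, 1, 5000), np.ones(5000)
        mc_x,   mc_w   = rng.normal(0, 1, 20000), np.full(20000, 0.25)
        print(weighted_chi2(data_x, data_w, mc_x, mc_w, bins=np.linspace(-4, 4, 41)))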

    Go to contribution page
  174. Riccardo Farinelli (Universita e INFN, Ferrara (IT))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    MPGDs are the new frontier in gas trackers. Among these
    devices, GEM chambers are widely used. The experimental signals acquired with the detector must obviously be reconstructed and analysed. In this
    contribution, a new offline software package to perform reconstruction,
    alignment and analysis of the data collected with APV-25 and TIGER ASICs will
    be presented. GRAAL (Gem Reconstruction And...

    Go to contribution page
  175. Alessandra Forti (University of Manchester (GB))
    Track 1: Computing Technology for Physics Research
    Poster

    In recent years the usage of machine learning techniques within data-intensive sciences in general and high-energy physics in particular has rapidly increased, in part due to the availability of large datasets on which such algorithms can be trained as well as suitable hardware, such as graphics or tensor processing units which greatly accelerate the training and execution of such algorithms....

    Go to contribution page
  176. Matthias Jochen Schnepf (KIT - Karlsruhe Institute of Technology (DE))
    Track 1: Computing Technology for Physics Research
    Poster

    The ever-growing amount of HEP data to be analyzed in the future already requires the allocation of additional, potentially only temporarily available, non-HEP-dedicated resources. These so-called opportunistic resources are also well suited to cover the typically unpredictable peak demands for computing resources in end-user analyses. However, their temporary availability requires a dynamic...

    Go to contribution page
  177. Oliver Majersky (Comenius University (SK))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Hadronic decays of vector bosons and top quarks are increasingly important to the ATLAS physics program, both in measurements of the standard model and in searches for new physics. At high energies, these decays are collimated into a single overlapping region of energy deposits in the detector, referred to as a jet. However, vector bosons and top quarks are hidden under an enormous background of...

    Go to contribution page
  178. Mr Zhanchen Wei (IHEP, CAS)
    Track 1: Computing Technology for Physics Research
    Poster

    The traditional partial wave analysis (PWA) algorithm is designed to process data serially and requires a large amount of memory to store runtime data, which may exceed the capacity of a single node. It is therefore necessary to parallelize this algorithm in a distributed data computing framework to improve its performance. Within an existing production-level Hadoop cluster, we implement...

    Go to contribution page
  179. Hao Cai (Wuhan University (CN))
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    Amplitude analysis is an important tool for the study of the hadron spectrum, in which the maximum likelihood method is used to estimate the parameters of a probability density function. In each optimization step, the likelihood values of a huge number of events from both data and Monte-Carlo simulations are calculated and summed, which is the most time-consuming part of the whole...

    Go to contribution page
  180. Dr Martin Ritter (LMU / Cluster Universe)
    Track 1: Computing Technology for Physics Research
    Poster

    The Belle II experiment at the SuperKEKB e+e- collider has completed its first-collisions run in 2018. The experiment is currently preparing for physics data taking in 2019. With many scientists now preparing their analysis, the user friendliness of the Belle II software framework is of great importance.

    Jupyter Notebooks allow for mixed code, documentation, and output like plots in an easy-to...

    Go to contribution page
  181. Andrey Zarochentsev (St Petersburg State University (RU))
    Track 1: Computing Technology for Physics Research
    Poster

    The poster focuses on our experience in using and extending JupyterLab
    in combination with EOS and CVMFS for HEP analysis within a local university group.
    We started with a copy of the CERN SWAN environment; after that our project evolved independently.
    A major difference is that we switched from the classic Jupyter Notebook
    to JupyterLab, because our users are more interested in text editor...

    Go to contribution page
  182. Alexander Nozik (INR RAS / MIPT)
    Track 1: Computing Technology for Physics Research
    Poster

    One of the problems of scientific software development is the lack of proper language tools to do it conveniently. Among modern languages, only a few have the flexibility and, most importantly, the libraries to handle scientific tasks: C++, Python and Java. In some cases, niche languages like C# or Julia can also be used.

    The major problem of C++ is the complexity of the language and...

    Go to contribution page
  183. Katharina Mueller (Universitaet Zuerich (CH))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    We present a new approach to the identification of boosted neutral particles using the electromagnetic calorimeters of the LHCb detector. The identification of photons and neutral pions is currently based on the expected properties of the objects reconstructed in the calorimeter. This makes it possible to distinguish single photons in the electromagnetic calorimeter from overlapping photons produced from high...

    Go to contribution page
  184. Katharina Mueller (Universitaet Zuerich (CH)), Mr Nikita Kazeev (Yandex School of Data Analysis (RU))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Particle identification is a key ingredient of most LHCb results. Muon identification in particular is used at every stage of the LHCb triggers. The objective of muon identification is to distinguish muons from the rest of the particles using only information from the Muon subdetector under strict timing constraints. We use a state-of-the-art gradient boosting algorithm and real data with...
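
    A minimal sketch of such a classifier using scikit-learn's gradient boosting is shown below; the features and labels are invented placeholders rather than the actual LHCb muon-subdetector variables:

        # Sketch: gradient-boosted classifier for muon vs. non-muon separation
        # (scikit-learn; features and labels are synthetic stand-ins, not LHCb data).
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n = 20000
        X = np.column_stack([
            rng.normal(0, 1, n),          # e.g. a matched-hit residual
            rng.poisson(3, n),            # e.g. number of fired muon stations
            rng.exponential(1.0, n),      # e.g. track-to-hit distance
        ])
        y = (X[:, 1] + rng.normal(0, 1, n) > 3).astype(int)   # toy label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
        clf.fit(X_tr, y_tr)
        print("test accuracy:", clf.score(X_te, y_te))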

    Go to contribution page
  185. Konstantin Malanchev
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The next generation of astronomical surveys will revolutionize our understanding of the Universe, raising unprecedented data challenges in the process. One of them is the impossibility to rely on human scanning for the identification of unusual/unpredicted astrophysical objects. Moreover, given that most of the available data will be in the form of photometric observations, such...

    Go to contribution page
  186. Stephan Hageboeck (University of Bonn (DE))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    RooFit and RooStats, the toolkits for statistical modelling in ROOT, are used in most searches and measurements at the Large Hadron Collider. The data to be collected in Run 3 will enable measurements with higher precision and models with larger complexity, but also require faster data processing.

    In this talk, first results on vectorising and multi-threading likelihood fits in RooFit will be...

    Go to contribution page
  187. Mantas Stankevicius (Vilnius University, Institute of Data Science and Digital Technologies (LT))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The Compact Muon Solenoid (CMS) is one of the general-purpose detectors at the CERN Large Hadron Collider (LHC) which collects enormous amounts of physics data. Before the final physics analysis can proceed, data has to be checked for quality (certified) by passing a number of automatic (like physics objects reconstruction, histogram preparation) and manual (checking, comparison and decision...

    Go to contribution page
  188. Serguei Bityukov (Institute for High Energy Physics (RU))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    In this work we present the results of comparing two versions of GEANT4 using experimental data from the HARP experiment. The comparison is performed with the help of a new method for the statistical comparison of data sets. The method provides more information for data analysis than methods based on the chi-squared distribution.

    Go to contribution page
  189. Prof. Alexander Pukhov (Skobeltsyn institute of Nuclear Physics)
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    micrOMEGAs is a package for the calculation of the relic density of Dark Matter and of different observables related to Dark Matter searches. The talk will present the general structure of the package and several recent developments, including freeze-in relic abundance calculation, interfaces with different packages that compute collider observables, and recent improvements in direct detection signals.

    Go to contribution page
  190. Adrian Alan Pol (Université Paris-Saclay (FR))
    Track 1: Computing Technology for Physics Research
    Poster

    Real-time monitoring of the Compact Muon Solenoid (CMS) trigger system is a vital task to ensure the quality of all physics results published by the collaboration. Today, the trigger monitoring software reports on potential problems given the time evolution of the reported rates. The anomalous rates are identified by their deviation from a prediction calculated using a regression model...

    Go to contribution page
  191. Giuseppe Avolio (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    ATLAS is one of the general-purpose experiments observing hadron
    collisions at the LHC at CERN. Its trigger and data acquisition system
    (TDAQ) is responsible for selecting and transporting interesting
    physics events from the detector to permanent storage where the data
    are used for physics analysis. The transient storage of ATLAS TDAQ is
    the last component of the online data-flow system. It...

    Go to contribution page
  192. Domenico Giordano (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    HEPSPEC-06 is a decade-old suite used to benchmark CPU resources for WLCG.
    Its adoption spans hardware vendors, site managers, funding agencies and software experts.
    It is stable, reproducible and accurate; however, it is reaching the end of its life.
    Initial hints of a lack of correlation with HEP applications have been collected.
    Looking for suitable alternatives, the HEPiX...

    Go to contribution page
  193. Nectarios Benekos (University of Peloponnese (GR))
    Track 1: Computing Technology for Physics Research
    Poster

    Data Quality Monitoring (DQM) is a very significant component of all high-
    energy physics (HEP) experiments. Data recorded by Data Acquisition (DAQ) sensors and devices are sampled to perform live monitoring of the status of each detector during data collection. This gives the system and the scientists the ability to identify problems with extremely low latency, minimizing the amount of data...

    Go to contribution page
  194. Dr Hong Guo (Institute of Applied Physics and Computational Mathematics)
    Track 1: Computing Technology for Physics Research
    Oral

    The overlapping grid technique can be used to solve partial differential equations defined on complex computational domains. However, large-scale realistic applications using overlapping grid technique under distributed memory systems are not easy. The grid points do not meet point by point and interpolation is needed. Applications with millions of grid points may consist of many blocks. A...

    Go to contribution page
  195. Lu Wang (Computing Center,Institute of High Energy Physics, CAS)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    HEP computing is a typical example of data-intensive computing. The performance of the distributed storage system largely defines the efficiency of HEP data processing and analysis. There is a large number of parameters that can be adjusted in a distributed storage system, and their settings have a great influence on performance. At present, these parameters are either set with static values...

    Go to contribution page
  196. Giuseppe Avolio (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    Over the next few years, the LHC will prepare for the upcoming High-Luminosity upgrade
    in which it is expected to deliver ten times more p-p collisions. This will create a harsher
    radiation environment and higher detector occupancy. In this context, the ATLAS
    experiment, one of the general purpose experiments at the LHC, plans substantial upgrades
    to the detectors and to the trigger system in...

    Go to contribution page
  197. Essma Redouane Salah (University of M'sila Algeria)
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    An efficient phase space integration is important for most calculations for collider experiments. We are developing a phase space integration that distributes phase space points according to the singular limits of QCD, using the Altarelli-Parisi splitting functions as the underlying probability for a splitting, by developing and applying theoretical and computational tools.
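
    The core trick, distributing points according to a singular splitting probability, can be shown in one dimension; the sketch below importance-samples a 1/(1-z) density, the kind of soft/collinear behaviour the splitting functions produce (an illustration of the approach with an arbitrary cutoff, not the authors' integrator):

        # Sketch: importance sampling of a 1/(1-z) singular density on [0, z_max].
        import numpy as np

        def sample_z(n, z_max=0.999, rng=None):
            """Draw z with density proportional to 1/(1-z) via inverse-transform sampling."""
            rng = rng or np.random.default_rng()
            u = rng.uniform(size=n)
            return 1.0 - (1.0 - z_max) ** u

        def integrate(f, n=100_000, z_max=0.999):
            """Monte Carlo estimate of the integral of f(z) on [0, z_max] using the singular sampler."""
            z = sample_z(n, z_max)
            norm = -np.log(1.0 - z_max)       # integral of 1/(1-z) over [0, z_max]
            weights = norm * (1.0 - z)        # 1 / pdf(z), flattens singular integrands
            return np.mean(f(z) * weights)

        # an integrand with the same 1/(1-z) singularity gives nearly flat weights
        print(integrate(lambda z: (1 + z**2) / (1 - z)))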

    Go to contribution page
  198. Sydney Otten (Radboud Universiteit Nijmegen)
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Although the standard model of particle physics is successful in describing physics as we know it, it is known to be incomplete. Many models have been developed to extend the standard model, none of which have been experimentally verified. One of the main hurdles in this effort is the dimensionality of these models, yielding problems in analysing, visualising and communicating results. Because...

    Go to contribution page
  199. Dorothea Vom Bruch (LPNHE Paris, CNRS)
    Track 1: Computing Technology for Physics Research
    Poster

    Beginning in 2021, the upgraded LHCb experiment will use a triggerless readout system collecting data at an event rate of 30 MHz. A software-only High Level Trigger will enable unprecedented flexibility for trigger selections. During the first stage (HLT1), a subset of the full offline track reconstruction for charged particles is run to select particles of interest based on single or...

    Go to contribution page
  200. Olmo Cerri (California Institute of Technology (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Mitigation of the effect of the multiple parasitic proton collisions produced during bunch crossings at the LHC is a major endeavor towards the realization of the physics program at the collider. The pileup affects many physics observables derived during the online and offline reconstruction. We propose a graph neural network machine learning model, based on the PUPPI approach, for identifying...

    Go to contribution page
  201. Matej Srebre (LMU Munich)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The pixel vertex detector is an essential part of the Belle II experiment, allowing us to determine the location of particle trajectories and decay vertices. The combined data from the innermost Pixel Vertex Detector (PXD), followed by the Silicon Vertex Detector (SVD), and the outermost Central Drift Chamber (CDC) are crucial in the event reconstruction phase to determine particle types,...

    Go to contribution page
  202. Antoni Shtipliyski (Imperial College (GB))
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    The High-Luminosity upgrade of LHC (HL-LHC) is expected to deliver a total luminosity of 3000 fb$^{-1}$ to the general purpose experiments. This will allow the measurement of Standard Model processes with unprecedented precision, and will significantly increase the reach of searches for new physics. Higher data rates and increased radiation levels will require substantial upgrades to the...

    Go to contribution page
  203. Etienne Lyard (University of Geneva)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The Cherenkov Telescope Array (CTA) will be the largest ground-based gamma-ray observatory. CTA will detect the signatures of gamma rays and cosmic-ray hadrons and electrons interacting with the Earth's atmosphere. Making the best possible use of this facility requires being able to separate events generated by gamma rays from the particle-induced background. Deep neural networks produced...

    Go to contribution page
  204. Ms Yao Zhang (Institiute of High Energy Physics, China), Prof. Ye Yuan (Institue of High Energy Physics, China)
    Track 1: Computing Technology for Physics Research
    Poster

    Mass Monte Carlo data production is the most CPU-intensive process in high-energy physics data analysis. The use of large-scale computational resources at HPC centers in China is expected to increase substantially the cost-efficiency of the processing. TianheII, the second fastest HPC in China, used to rank first in the TOP500. We report on the technical challenges and...

    Go to contribution page
  205. Mr Petr Bouř (FNSPE CTU in Prague)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Event reconstruction for the NOvA experiment is a critical step preceding further data analysis. We describe the complex NOvA reconstruction pipeline (containing several unsupervised learning techniques) with a focus on the specific step of so-called "prong matching". In this step, we combine 2D prongs (projections of particle trajectories) into 3D prong objects. In order to find the best...

    Go to contribution page
  206. Lukas Alexander Heinrich (New York University (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    A large class of statistical models in high energy physics can be expressed as a simultaneous measurement of binned observables. A popular framework for such binned analyses is HistFactory. So far the only implementation of the model has been within the ROOT ecosystem, limiting adoption and extensibility. We present a complete and extensible implementation of the HistFactory class of models in...
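
    At its core, such a model multiplies Poisson terms over bins with constraint terms for the nuisance parameters. The stripped-down numerical toy below (NumPy/SciPy with invented yields, not the implementation presented in the talk) shows a single channel with one constrained background normalisation:

        # Toy binned likelihood in the HistFactory spirit: Poisson terms per bin
        # times a Gaussian constraint on a background normalisation nuisance.
        import numpy as np
        from scipy.stats import poisson, norm

        signal   = np.array([2.0, 5.0, 3.0])       # expected signal per bin (made up)
        bkg      = np.array([50.0, 48.0, 30.0])    # nominal background per bin (made up)
        bkg_unc  = 0.10                            # 10% normalisation uncertainty
        observed = np.array([53, 55, 34])

        def nll(mu, alpha):
            """Negative log-likelihood for signal strength mu and nuisance alpha."""
            expected = mu * signal + (1.0 + bkg_unc * alpha) * bkg
            return -(poisson.logpmf(observed, expected).sum() + norm.logpdf(alpha))

        # crude grid scan over the signal strength and the nuisance parameter
        mus, alphas = np.linspace(0, 3, 61), np.linspace(-3, 3, 61)
        grid = np.array([[nll(m, a) for a in alphas] for m in mus])
        best = np.unravel_index(grid.argmin(), grid.shape)
        print("best-fit mu ~", mus[best[0]], "alpha ~", alphas[best[1]])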

    Go to contribution page
  207. Oksana Shadura (University of Nebraska Lincoln (US))
    Track 1: Computing Technology for Physics Research
    Poster

    The LHC’s Run3 will push the envelope on data-intensive workflows and, at the lowest level, this data is managed using the ROOT software framework. At the beginning of Run 1, all data was compressed with the ZLIB algorithm: ROOT has since added support for multiple new algorithms (such as LZMA and LZ4), each with unique strengths. Work is continuing as industry introduces new techniques -...
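
    The trade-offs between algorithms can be explored with the compressors in Python's standard library (zlib and lzma here as widely available stand-ins on an arbitrary synthetic payload; LZ4 needs a third-party package, and ROOT's own compression layer is not shown):

        # Sketch: comparing compression ratio and speed of two algorithms on the same payload.
        import time
        import zlib
        import lzma
        import numpy as np

        payload = np.random.default_rng(0).normal(size=250_000).astype("float32").tobytes()

        for name, compress in (("zlib", lambda b: zlib.compress(b, 6)),
                               ("lzma", lambda b: lzma.compress(b, preset=6))):
            start = time.perf_counter()
            out = compress(payload)
            dt = time.perf_counter() - start
            print(f"{name}: ratio {len(payload) / len(out):.2f}, {dt * 1e3:.1f} ms")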

    Go to contribution page
  208. Alexander Hagen (PNNL)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Measurements in Liquid Argon TPC (LArTPC) neutrino detectors, such as the MicroBooNE detector at Fermilab, feature large, high fidelity event images. Deep learning techniques have been extremely successful in classification tasks of photographs, but their application to LArTPC event images is challenging, due to the large size of the events. Events in these detectors are typically two orders...

    Go to contribution page
  209. Victoria Tokareva (Karlsruhe Institute of Technology (KIT))
    Track 1: Computing Technology for Physics Research
    Poster

    Increasing data rates open up new opportunities for astroparticle physics by improving the precision of data analysis and by deploying advanced analysis techniques that demand relatively large data volumes, e.g. deep learning. One of the ways to increase statistics is to combine data from different experimental setups for joint analysis. Moreover, such data integration provides us with an...

    Go to contribution page
  210. Oksana Shadura (University of Nebraska Lincoln (US))
    Track 1: Computing Technology for Physics Research
    Poster

    Distinct HEP workflows have distinct I/O needs; while ROOT I/O excels at serializing complex C++ objects common to reconstruction, analysis workflows typically have simpler objects and can sustain higher event rates. To meet these workflows, we have developed a “bulk I/O” interface, allowing multiple events’ data to be returned per library call. This reduces ROOT-related overheads and...

    Go to contribution page
  211. Dr ZIYAN DENG (Institute of High Energy Physics, Beijing, China)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The BESIII experiment studies physics in the tau-charm energy region. Since 2009, BESIII has collected large-scale data samples, and many important physics results have been achieved based on these samples. Gaudi is used as the underlying framework of the BESIII offline software, for both data production and data analysis. As data sets accumulate year by year, the efficiency of data analysis becomes more and more...

    Go to contribution page
  212. Olmo Cerri (California Institute of Technology (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    We show how Interaction Networks could be used for jet tagging at the Large Hadron Collider.
    We take as an example the problem of identifying high-pT H->bb decays exploiting both jet substructure and secondary vertices from b quarks. We consider all tracks produced in the hadronization of the two b’s and represent the jet both as a track-to-track and a track-to-vertex interaction. The...

    Go to contribution page
  213. Dr Stefano Laporta (University of Padova)
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Oral

    We describe in some detail techniques used in the calculation of
    4-loop QED contributions to quantities such as g-2,
    the slope of the Dirac form factor, and renormalization constants;
    in particular, different approaches to the parallelization of some
    parts of the calculations. Some recent results will also be presented.

    Go to contribution page
  214. Doris Yangsoo Kim (Soongsil University)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The SuperKEKB collider and the Belle II experiment have finished the second phase of their runs in 2018, which was an essential step to study the e+ e- beam collisions and prepare for the third phase of the runs. The third phase starts at the beginning of 2019, and it is planned to collect a data sample of 50/ab during the following decade.
    The simulation library of the Belle II experiment...

    Go to contribution page
  215. Nicola Louise Abraham (University of Sussex (GB))
    Track 1: Computing Technology for Physics Research
    Poster

    The design and performance of the ATLAS Inner Detector (ID) trigger algorithms running online on the high level trigger (HLT) processor farm for 13 TeV LHC collision data with high pile-up are discussed.

    The HLT ID tracking is a vital component in all physics signatures in the ATLAS trigger for the precise selection of the rare or interesting events necessary for physics analysis...

    Go to contribution page
  216. Mr Konstantin Pugachev (Budker Institute of Nuclear Physics (RU))
    Track 1: Computing Technology for Physics Research
    Poster

    The SND detector has been operating at the VEPP-2000 collider (BINP, Russia) for several years unveiling amazing knowledge. Being a scientific facility it experiences constant improvements. One of the improvements worth mentioning is the DQM system for the SND detector.

    First, information is collected automatically by DQM scripts and then could be corrected/confirmed by the detector operators...

    Go to contribution page
  217. Heather Gray (LBNL)
    Track 2: Data Analysis - Algorithms and Tools
    Oral

    Universal Quantum Computing may still be a few years away, but we have entered the Noisy Intermediate-Scale Quantum era which ranges from D-Wave commercial Quantum Annealers to a wide selection of gate-based quantum processor prototypes. These provide us with the opportunity to evaluate the potential of quantum computing for HEP applications.
    We will present early results from the DOE HEP.QPR...

    Go to contribution page
  218. Leo Piilonen (Virginia Tech)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    I describe the charged-track extrapolation and muon-identification modules in the Belle II data-analysis code framework (basf2). These modules use GEANT4E to extrapolate reconstructed charged tracks outward from the Belle II Central Drift Chamber into the outer particle-identification detectors, the electromagnetic calorimeter, and the K-long and muon detector (KLM). These modules propagate...

    Go to contribution page
  219. Fabian Klimpel (Max-Planck-Institut fur Physik (DE))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Track finding and fitting are amongst the most complex parts of
    event reconstruction in high-energy physics, and usually dominate
    the computing time in high-luminosity environments. A central part of
    track reconstruction is the transport of a given track parameterisation
    (i.e. the parameter estimates and associated covariances) through the
    detector, respecting the magnetic field setup and the...

    Go to contribution page
  220. Luis Miguel Garcia Martin (Univ. of Valencia and CSIC (ES))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The LHCb experiment is dedicated to the study of c- and b-hadron decays, including long-lived particles such as the Ks and strange baryons (Lambda, Xi, etc.). These kinds of particles are difficult to reconstruct with the LHCb tracking systems since they escape detection in the first tracker. A new method to evaluate the performance in terms of efficiency and throughput of the different...

    Go to contribution page
  221. Gareth Douglas Roy (University of Glasgow (GB))
    Track 1: Computing Technology for Physics Research
    Poster

    In software development, Continuous Integration (CI), the practice of bringing together multiple developers’ code modifications in a single repository, and Continuous Delivery (CD), the practice of automatically creating and testing releases, are well known. CI/CD pipelines are available in many automation tools (such as GitLab) and act to enhance and speed up software development.

    Continuous...

    Go to contribution page
  222. Mr Oscar Roberto Chaparro Amaro (Instituto Politécnico Nacional. Centro de Investigación en Compuutación.)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Probability distribution functions (PDFs) are widely used in modeling random
    processes and physics simulations. It can be demonstrated that the
    relationships between PDFs are linked through functional parameters. Improving the
    performance of the generation of many random numbers to be used as input
    by the PDFs is often a very challenging task, as it involves algorithms with
    acceptance-rejection...
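
    Acceptance-rejection sampling, the bottleneck mentioned above, proposes points from a simple envelope and keeps each one with probability proportional to the target density; a minimal NumPy version with an arbitrary toy target follows:

        # Sketch: acceptance-rejection sampling of f(x) proportional to sin(x)**2 on [0, pi]
        # using a flat envelope (illustration of the method only).
        import numpy as np

        def sample(n, rng=None):
            rng = rng or np.random.default_rng()
            target = lambda x: np.sin(x) ** 2     # unnormalized target density
            c = 1.0                               # envelope bound: sin(x)**2 <= 1 on [0, pi]
            accepted = []
            while len(accepted) < n:
                x = rng.uniform(0.0, np.pi, size=2 * n)   # propose in batches
                u = rng.uniform(0.0, c, size=2 * n)
                accepted.extend(x[u < target(x)])          # keep with prob f(x)/c
            return np.array(accepted[:n])

        samples = sample(10_000)
        print(samples.mean())    # should be close to pi/2 by symmetry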

    Go to contribution page