Conveners
Plenary: Opening
- Federico Carminati (CERN)
Plenary
- Fons Rademakers (CERN)
Plenary
- Maria Girone (CERN)
Plenary
- Andrey Arbuzov (Joint Institute for Nuclear Research (JINR))
Plenary
- David Britton (University of Glasgow (GB))
Plenary
- Jerome Lauret (Brookhaven National Laboratory)
Plenary
- Gudrun Heinrich (Max Planck Institute for Physics)
Plenary
- Gang Chen (Institute of High Energy Physics)
Plenary
- Gordon Watts (University of Washington (US))
Plenary
- Monique Werlen (EPFL - Ecole Polytechnique Federale Lausanne (CH))
Computer algebra is one of the key tools of modern physics research. In this talk I will give an overview of the main mathematical and programming concepts that form the basis of modern computer algebra tools, and how they are applied to solve problems in modern theoretical physics and engineering. I will also give a brief overview of modern computer algebra software, including general...
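As a minimal illustration of the kind of symbolic manipulation such tools automate, here is a short sketch using the open-source SymPy library (chosen here purely for illustration; the talk itself surveys a broader range of software):

    # A minimal sketch of computer algebra with SymPy (illustrative only).
    import sympy as sp

    x = sp.symbols('x')

    # Symbolic simplification of a trigonometric identity
    print(sp.simplify(sp.sin(x)**2 + sp.cos(x)**2))        # 1

    # Series expansion, as used e.g. in perturbative calculations
    print(sp.series(sp.exp(x), x, 0, 4))                   # 1 + x + x**2/2 + x**3/6 + O(x**4)

    # Exact symbolic integration
    print(sp.integrate(x**2 * sp.exp(-x), (x, 0, sp.oo)))  # 2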
The extremely low flux of ultra-high energy cosmic rays (UHECR) makes their direct observation by orbital experiments practically impossible. For this reason, all current and planned UHECR experiments detect cosmic rays indirectly, observing extensive air showers (EAS) initiated by cosmic-ray particles in the atmosphere. Various types of shower observables are analysed in modern...
X-ray Free Electron Lasers (XFELs) are among the most complex accelerator projects in the world today. With large parameter spaces, sensitive dependence on beam quality, huge data rates, and challenging machine protection, there are diverse opportunities to apply machine learning (ML) to XFEL operation. This talk will summarize promising ML methods and highlight recent examples of successful...
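As an illustration of one such method, the sketch below runs Bayesian-optimization-style tuning of a single machine parameter with a Gaussian-process surrogate (scikit-learn); the objective function standing in for a real pulse-energy readback is hypothetical:

    # Sketch of Gaussian-process-based tuning of one machine parameter
    # (e.g. a magnet setting vs. FEL pulse energy). The response function
    # below is a made-up stand-in for real machine readbacks.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def pulse_energy(setting):               # hypothetical machine response
        return np.exp(-(setting - 0.3)**2 / 0.05)

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(3, 1))       # initial random settings
    y = pulse_energy(X).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1))
    for _ in range(10):
        gp.fit(X, y)
        cand = rng.uniform(0, 1, size=(256, 1))
        mu, sigma = gp.predict(cand, return_std=True)
        nxt = cand[np.argmax(mu + 2.0 * sigma)]   # upper-confidence-bound pick
        X = np.vstack([X, nxt])
        y = np.append(y, pulse_energy(nxt))

    print("best setting:", X[np.argmax(y)].item(), "energy:", y.max())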
We present a novel framework that enables efficient probabilistic inference in large-scale scientific models by allowing the execution of existing domain-specific simulators as probabilistic programs, resulting in highly interpretable posterior inference. Our framework is general purpose and scalable, and is based on a cross-platform probabilistic execution protocol through which an inference...
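The protocol itself is not reproduced here, but the underlying idea can be sketched in a few lines: treat the simulator's internal random draws as latent variables and condition on an observation by likelihood weighting. This is a plain NumPy/SciPy toy, not the framework's actual interface:

    # Toy illustration: a 'simulator' whose internal random draw is the
    # latent variable we infer from an observation by importance sampling.
    import numpy as np
    from scipy.stats import norm

    def simulator(rng):
        x = rng.normal(0.0, 1.0)          # latent random choice (prior)
        y_mean = x**2                     # deterministic simulation step
        return x, y_mean

    y_obs, rng = 1.5, np.random.default_rng(1)
    xs, ws = [], []
    for _ in range(100_000):
        x, y_mean = simulator(rng)
        xs.append(x)
        ws.append(norm.pdf(y_obs, loc=y_mean, scale=0.2))  # condition on y_obs

    xs, ws = np.array(xs), np.array(ws)
    print("E[|x| | y_obs]:", np.sum(ws * np.abs(xs)) / np.sum(ws))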
Since 2013, ETH Zürich and the University of Bologna have been working on the PULP project to develop energy-efficient computing architectures suitable for a wide range of applications, from the IoT domain, where computations have to be done in a few milliwatts, all the way to the HPC domain, where the goal is to extract the maximum number of calculations within a given power budget. For...
Women obtain more than half of U.S. undergraduate degrees in biology, chemistry, and mathematics, yet they earn less than 20% of computer science, engineering, and physics undergraduate degrees (NSF, 2014). Why are women represented in some STEM fields more than others? The STEM Paradox and the Gender Equality Paradox show that countries with greater gender equality have a lower percentage of...
Modern electronic general-purpose computing has been on an unparalleled path of exponential acceleration for more than seven decades. From the 1970s onwards, this trend was driven by the success of integrated circuits based on silicon technology. The exponential growth has become a self-fulfilling (and economically driven) prophecy commonly referred to as Moore's Law. The end of Moore's law has...
An important part of the LHC legacy will be precise limits on indirect effects of new physics, framed for instance in terms of an effective field theory. These measurements often involve many theory parameters and observables, which makes them challenging for traditional analysis methods. We discuss the underlying problem of "likelihood-free" inference and present powerful new analysis...
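One standard building block in this line of work is the "likelihood-ratio trick": a classifier trained to separate events simulated at two parameter points yields an estimate of their likelihood ratio. A toy sketch (Gaussians standing in for simulated events, logistic regression instead of a neural network):

    # Classifier-based likelihood-ratio estimation on toy data: a model
    # trained to separate samples from theta0 and theta1 gives
    # r(x) ~ p(x|theta1)/p(x|theta0) via s/(1-s).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    x0 = rng.normal(0.0, 1.0, size=(50_000, 1))   # events at theta0
    x1 = rng.normal(0.5, 1.0, size=(50_000, 1))   # events at theta1

    X = np.vstack([x0, x1])
    z = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])

    clf = LogisticRegression().fit(X, z)
    s = clf.predict_proba(np.array([[1.0]]))[0, 1]
    ratio_est = s / (1 - s)

    # Exact ratio of the two Gaussians at x = 1.0, for comparison
    ratio_true = np.exp(-0.5 * ((1.0 - 0.5)**2 - 1.0**2))
    print(ratio_est, ratio_true)                   # both close to 1.45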
The HEP software ecosystem faces new challenges in 2020 with the approach of the High Luminosity LHC (HL-LHC) and the turn-on of a number of large new experiments. Current software development is organized around the experiments; no other field has attained this level of self-organization and collaboration in software development.
During 2017 the community produced a roadmap for...
At the HL-LHC, ATLAS and CMS will see proton bunch collisions with track multiplicities of up to 10,000 charged tracks per event. Algorithms need to be developed to harness the increased combinatorial complexity. To engage the computer science community to contribute new ideas, we organize a Tracking Machine Learning challenge (TrackML). Participants are provided events with 100k 3D points, and...
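In the spirit of the challenge, a simple unsupervised baseline clusters hits whose positions are consistent with a common straight-line track; the sketch below uses DBSCAN on angular features of synthetic hits (all parameters illustrative, not the competition's scoring setup):

    # Toy tracking baseline: hits from one roughly straight track share the
    # same direction as seen from the origin, so clustering in angular
    # features groups them. Synthetic hits stand in for TrackML events.
    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)
    hits = []
    for track_id in range(50):                    # 50 toy straight tracks
        phi0, theta0 = rng.uniform(0, 2*np.pi), rng.uniform(0.3, np.pi - 0.3)
        direction = np.array([np.sin(theta0)*np.cos(phi0),
                              np.sin(theta0)*np.sin(phi0), np.cos(theta0)])
        for r in np.linspace(50, 1000, 10):       # 10 detector layers
            hits.append(r * direction + rng.normal(0, 0.5, 3))

    hits = np.array(hits)
    phi = np.arctan2(hits[:, 1], hits[:, 0])
    theta = np.arctan2(np.hypot(hits[:, 0], hits[:, 1]), hits[:, 2])
    feats = np.column_stack([np.cos(phi), np.sin(phi), theta])

    labels = DBSCAN(eps=0.02, min_samples=3).fit_predict(feats)
    print("found", len(set(labels) - {-1}), "track candidates for 50 tracks")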
The anomalous magnetic moment of the electron $a_e$ and that of the muon $a_\mu$ occupy special positions in precision tests of the Standard Model of elementary particles. Both have been precisely measured, to 0.24 ppb for $a_e$ and 0.5 ppm for $a_\mu$, and new experiments on both $a_e$ and $a_\mu$ are ongoing, aiming to reduce the uncertainties. Theoretical calculations of $a_e$ and $a_\mu$...
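For reference, the standard textbook definitions behind these quantities (well-known relations, not specific to this talk) are: the anomaly is the deviation of the gyromagnetic ratio from its Dirac value,
$$ a_\ell = \frac{g_\ell - 2}{2}, \qquad \ell = e, \mu, $$
and its QED contribution is computed as a perturbative series in $\alpha/\pi$,
$$ a_\ell^{\mathrm{QED}} = \sum_{n \ge 1} C_\ell^{(2n)} \left(\frac{\alpha}{\pi}\right)^{n}, \qquad C^{(2)} = \frac{1}{2} \;\; \text{(Schwinger, 1948)}. $$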
Beginning in 2021, the upgraded LHCb experiment will use a triggerless readout system collecting data at an event rate of 30 MHz. A software-only High Level Trigger will enable unprecedented flexibility for trigger selections. During the first stage (HLT1), a subset of the full offline track reconstruction for charged particles is run to select particles of interest based on single or...
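A single-particle selection of this kind reduces, schematically, to requiring one displaced, high-momentum track. The sketch below illustrates the shape of such a decision with hypothetical thresholds and a simplified track representation; it is not LHCb code:

    # Sketch of a single-track trigger decision: keep the event if any
    # reconstructed track is both hard and displaced from the primary
    # vertex. Thresholds and the Track type are hypothetical.
    from dataclasses import dataclass

    PT_MIN = 1000.0       # MeV, hypothetical transverse-momentum cut
    IPCHI2_MIN = 16.0     # hypothetical impact-parameter chi2 cut

    @dataclass
    class Track:
        pt: float         # transverse momentum [MeV]
        ip_chi2: float    # impact-parameter chi2 w.r.t. nearest PV

    def one_track_decision(tracks):
        """Event passes if any single track satisfies both cuts."""
        return any(t.pt > PT_MIN and t.ip_chi2 > IPCHI2_MIN for t in tracks)

    event = [Track(pt=450.0, ip_chi2=2.0), Track(pt=2300.0, ip_chi2=40.0)]
    print(one_track_decision(event))   # True: the second track fires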
The LHCb experiment is dedicated to the study of c- and b-hadron decays, including long-lived particles such as Ks and strange baryons (Lambda, Xi, etc.). These kinds of particles are difficult to reconstruct with the LHCb tracking systems since they escape detection in the first tracker. A new method to evaluate the performance, in terms of efficiency and throughput, of the different...
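Schematically, the two figures of merit can be measured as follows (a toy sketch with made-up inputs; the actual method is the subject of the talk):

    # Efficiency as the truth-matched fraction of reconstructible
    # particles, and throughput as processed events per second.
    import time

    def efficiency(true_ids, matched_ids):
        """Fraction of reconstructible true particles that were matched."""
        return len(set(matched_ids) & set(true_ids)) / len(true_ids)

    print(efficiency(true_ids=[1, 2, 3, 4, 5], matched_ids=[1, 3, 5]))  # 0.6

    def throughput(reconstruct, events):
        """Events per second for a given reconstruction callable."""
        start = time.perf_counter()
        for ev in events:
            reconstruct(ev)
        return len(events) / (time.perf_counter() - start)

    print(throughput(lambda ev: sorted(ev), [list(range(1000))] * 500))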