Conveners
Plenary: Opening
- Gordon Watts (University of Washington (US))
Plenary: Plenary Session
- Denis Perret-Gallix (Centre National de la Recherche Scientifique (FR))
Plenary: Plenary Session
- Toby Burnett (University of Washington)
Plenary: Plenary Session
- Niko Neufeld (CERN)
Plenary: Plenary Session
- Daniel Maitre
Plenary: Plenary Session
- Sergei Gleyzer (University of Florida (US))
Plenary: Plenary Session
- Maria Girone (CERN)
Plenary: Plenary Session
- Pushpalatha Bhat (Fermi National Accelerator Lab. (US))
Plenary: Plenary Session
- David Britton (University of Glasgow (GB))
Plenary: Plenary Session
- Maria Girone (CERN)
Plenary: Track Summaries and Conclusions
- Federico Carminati (CERN)
Plenary: Registration
- Gordon Watts (University of Washington (US))
Symbolic computation is an indispensable tool for theoretical particle physics, especially in the context of perturbative quantum field theory. In this talk, I will review FORM, one of the computer algebra systems most widely used in higher-order calculations, its design principles, and its advantages. The newly released version 4.2 will also be discussed.
Modern machine learning (ML) has introduced a new and powerful toolkit to High Energy Physics. While only a small number of these techniques are currently used in practice, research and development centered around modern ML has exploded in recent years. I will highlight recent advances, with a concrete focus on jet physics. Themselves defined by unsupervised learning algorithms,...
We start the discussion by summarizing recent and consolidated applications of ML in theoretical high-energy physics (TH-HEP). We then focus on recent studies of parton distribution function determination and on related tools based on machine-learning algorithms and strategies. We conclude by showing future theoretical applications of ML to Monte Carlo codes.
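The core idea behind ML-based parton distribution function determination (as in the NNPDF approach) is to parametrize the unknown function with a neural network and fit its weights to data, rather than assuming a fixed functional form. The following is a minimal, hypothetical sketch of that idea only: the target function, network size, and training scheme are made up for illustration and are not the actual NNPDF methodology or code.

```python
import math, random

# Toy "fit an unknown function with a neural network" example:
# a one-input, one-hidden-layer tanh network trained by plain
# stochastic gradient descent on a least-squares loss.
random.seed(0)
H = 8                                             # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]    # input -> hidden weights
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]    # hidden -> output weights
b2 = 0.0

xs = [i / 20 for i in range(21)]
ys = [x * (1 - x) for x in xs]                    # stand-in "data" points

def predict(x):
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * hidden[j] for j in range(H)) + b2, hidden

def loss():
    return sum((predict(x)[0] - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

initial_loss = loss()
lr = 0.05
for epoch in range(3000):
    for x, y in zip(xs, ys):
        out, hidden = predict(x)
        d_out = 2 * (out - y)                     # dL/d(output) at this point
        b2 -= lr * d_out
        for j in range(H):
            w2_old = w2[j]
            w2[j] -= lr * d_out * hidden[j]
            d_h = d_out * w2_old * (1 - hidden[j] ** 2)   # tanh' = 1 - tanh^2
            w1[j] -= lr * d_h * x
            b1[j] -= lr * d_h
final_loss = loss()
```

A real PDF fit adds much more: experimental uncertainties and correlations, QCD evolution of the parametrized function to the scale of each data point, sum rules as constraints, and Monte Carlo replicas to propagate uncertainties.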
Can we evolve the C++ language itself to make C++ programming both more powerful and simpler, and if so, how? The only way to accomplish both of those goals at the same time is by adding abstractions that let programmers directly express their intent: to elevate comments and documentation to testable code, and to elevate coding patterns and idioms into compiler-checkable declarations.
This talk...
The reconstruction of particle trajectories in the tracking detectors is one of the most complex parts of analysing the data at hadron colliders. Maximum luminosity is typically achieved at the cost of a large number of simultaneous proton-proton interactions per beam crossing. The large number of particles produced in such interactions introduces challenges both in terms of maintaining...
Machine Learning techniques have been used in different applications by the HEP community; in this talk, we discuss the case of detector simulation. The number of simulated events that the LHC experiments and their High Luminosity upgrades are expected to need is increasing dramatically, requiring new fast simulation solutions. We will present results of several studies on the application of...
In this talk, I will give a quick overview of physics results and computational methods in lattice QCD. Then I will outline some of the physics challenges, especially those of interest to particle physicists. Last, I will speculate on how machine-learning ideas could be applied to accelerate lattice-QCD algorithms.
This presentation will share details about the Intel Nervana Deep Learning Platform and how a data scientist can use it to develop solutions for deep learning problems. The Intel Nervana DL Platform is a full-stack platform, including hardware and software tools, that enables data scientists to build high-accuracy deep learning solutions more quickly and cost-effectively than with alternative...
The emergence of Cloud Computing has resulted in an explosive growth of computing power, where even moderately sized datacenters rival the world's most powerful supercomputers in raw compute capacity.
Microsoft's Catapult project has augmented its datacenters with FPGAs (Field-Programmable Gate Arrays), which not only expand the compute capacity and efficiency for scientific computing, but...
The round table will feature the following panelists:
Kyle Cranmer
Wahid Bhimji
Michela Paganini
Andrey Ustyuzhanin
Sergei Gleyzer
We've known for a while now that projections of computing needs for the experiments running 10 years from now are unaffordable. Over the past year the HSF has convened a series of workshops aiming to find consensus on those needs and to produce proposals for research and development to address this challenge. At this time, many of the software-related drafts are far enough along to give a...
Simply preserving the data from a scientific experiment is rarely sufficient to enable the re-use or re-analysis of the data. Instead, a more complete set of knowledge describing how the results were obtained, including analysis software and workflows, computation environments, and other documentation may be required. This talk explores the challenges in preserving the various knowledge...
Research has shown that diversity enhances creativity. It encourages the search for novel information and perspectives, leading to better decision making and problem solving, and to unfettered discoveries and breakthrough innovations. Even simply being exposed to diversity can change the way you think.
Professional development opportunities are needed to train faculty and staff to improve...
Our panel will cover the topics of "How to create/hire diversity into teams and the competitive advantage of diverse teams".
We would like to collect questions you may have in advance so panelists have time to prepare comprehensive answers. We will collect them until Wednesday 23rd, noon. The form for this is at...