29 November 2021 to 3 December 2021
Virtual and IBS Science Culture Center, Daejeon, South Korea
Asia/Seoul timezone

Session

Posters: Walnut

1 Dec 2021, 19:00
Walnut (Gather.Town)


  1. Kilian Lieret
    Track 1: Computing Technology for Physics Research
    Poster

    The physics output of modern experimental HEP collaborations hinges not only on the quality of their software but also on the collaborators' ability to make the best possible use of it.

    With the COVID-19 pandemic making in-person training impossible, the training paradigm at Belle II was shifted towards one of guided self-study.

    To that end, the study material was rebuilt from...

    Go to contribution page
  2. Bogdan Kutsenko (Budker Institute of Nuclear Physics (RU))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The study of the conversion decay of the omega meson into the $\pi^{0} e^{+} e^{-}$ final state was performed with the CMD-3 detector at the VEPP-2000 electron-positron collider in Novosibirsk. The main physical background to the process under study is the radiative decay $\omega \to \pi^{0} \gamma$, where a monochromatic photon converts in the material in front of the detector. A deep neural network was...

    Go to contribution page
  3. Artem Uskov (Budker Institute of Nuclear Physics)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Analysis of the CMD-3 detector data: searching for low-energy electron-positron annihilation into $KK\pi$ and $KK\pi\pi^0$

    A. A. Uskov.
    Budker Institute of Nuclear Physics, Siberian Branch of the Russian Academy of Sciences.

    We explored the process $e^+e^- \to KK\pi$ with the CMD-3 detector at the electron-positron collider VEPP-2000. The data amassed by the CMD-3 detector in the...

    Go to contribution page
  4. Patrick Reichherzer (Ruhr-University Bochum)
    Track 3: Computations in Theoretical Physics: Techniques and Methods
    Poster

    In astrophysics, the search for sources of the highest-energy cosmic rays continues. For further progress, not only ever better observatories but also ever more realistic numerical simulations are needed. We present here a novel approach to charged particle propagation that finds its application in simulations of particle propagation in jets of active galactic nuclei, possible sources of...

    Go to contribution page
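The abstract above does not spell out the propagation scheme; as general background, a standard technique for pushing a charged particle through a magnetic field is the Boris scheme, which conserves the particle's speed exactly in a pure magnetic field. A minimal sketch (not the authors' method, all names hypothetical):

```python
import numpy as np

def boris_push(x, v, B, q_over_m, dt, n_steps):
    """Advance a charged particle in a static magnetic field B using the
    Boris scheme; the velocity is rotated, so |v| is conserved exactly."""
    for _ in range(n_steps):
        t = 0.5 * q_over_m * dt * B            # half-step rotation vector
        s = 2.0 * t / (1.0 + np.dot(t, t))
        v_prime = v + np.cross(v, t)           # first half of the rotation
        v = v + np.cross(v_prime, s)           # second half of the rotation
        x = x + v * dt                         # position update
    return x, v

# Example: gyration in a uniform field along z.
x, v = boris_push(np.zeros(3), np.array([1.0, 0.0, 0.1]),
                  np.array([0.0, 0.0, 1.0]),
                  q_over_m=1.0, dt=0.01, n_steps=1000)
```

Because the field is purely magnetic, the speed and the velocity component along B are unchanged after any number of steps.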
  5. Dr Marco Letizia (MaLGa, University of Genoa and INFN - National Institute for Nuclear Physics)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Kernel methods represent an elegant and mathematically sound approach to nonparametric learning, but so far could hardly be used in large-scale problems, since naïve implementations scale poorly with data size. Recent improvements have shown the benefits of a number of algorithmic ideas, combining optimization, numerical linear algebra and random projections. These, combined with (multi-)GPU...

    Go to contribution page
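One of the random-projection ideas alluded to above is kernel approximation with random Fourier features (Rahimi and Recht), which replaces an exact kernel matrix with an explicit low-dimensional feature map. A sketch of the idea for the RBF kernel (not necessarily the poster's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features, gamma, rng):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2): draw frequencies from a Gaussian,
    map each point to cosines, and inner products approximate the kernel."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(50, 3))
Z = rff_features(X, n_features=5000, gamma=0.5, rng=rng)

K_exact = np.exp(-0.5 * ((X[:, None] - X[None]) ** 2).sum(-1))
K_approx = Z @ Z.T                     # linear in n once features are built
max_err = np.abs(K_exact - K_approx).max()
```

The approximation error shrinks like $1/\sqrt{D}$ in the number of features $D$, which is what makes the linear-algebra side of the problem tractable at scale.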
  6. Artem Maevskiy (National Research University Higher School of Economics (RU))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Detailed detector simulation models are vital for the successful operation of modern high-energy physics experiments. In most cases, such detailed models require a significant amount of computing resources to run. Often this cannot be afforded, and less resource-intensive approaches are desired. In this work, we demonstrate the applicability of Generative Adversarial Networks (GANs) as the...

    Go to contribution page
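For reference, the standard GAN training objective (Goodfellow et al.) that underlies such generative fast-simulation approaches is the minimax game between a generator $G$ and a discriminator $D$:

```latex
\min_G \max_D \;
\mathbb{E}_{x \sim p_{\mathrm{data}}}\left[\log D(x)\right]
+ \mathbb{E}_{z \sim p_z}\left[\log\bigl(1 - D(G(z))\bigr)\right]
```

At the optimum, samples from $G$ become indistinguishable from detailed-simulation output as judged by $D$, which is the property exploited when a GAN stands in for the full detector model.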
  7. Manfred Peter Fackeldey (Rheinisch Westfaelische Tech. Hoch. (DE))
    Track 1: Computing Technology for Physics Research
    Poster

    Fast turnaround times for LHC physics analyses are essential for scientific success. The ability to quickly perform optimizations and consolidation studies is critical. At the same time, computing demands and complexities are rising with the upcoming data taking periods and new technologies, such as deep learning.
    We present a showcase of the HH->bbWW analysis at the CMS experiment, where we...

    Go to contribution page
  8. Kaushal Gumpula (Fermi National Accelerator Lab. (US)), Mr Nikita Koloskov (University of Chicago), Jeremy Edmund Hewes (University of Cincinnati (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The Exa.TrkX project presents a graph neural network (GNN) technique for low-level reconstruction of neutrino interactions in a Liquid Argon Time Projection Chamber (LArTPC). GNNs are still a relatively novel technique, and have shown great promise for similar reconstruction tasks at the LHC. Graphs describing particle interactions are formed by treating each detector hit as a node, with edges...

    Go to contribution page
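The hit-to-graph construction described above (each hit a node, edges between nearby hits) can be sketched in a few lines; the distance-based edge rule here is an assumption for illustration, not necessarily the Exa.TrkX edge-building criterion:

```python
import numpy as np

def build_hit_graph(hits, max_dist):
    """Turn detector hits (N x 2 array of coordinates) into a graph:
    one node per hit, an undirected edge between hits closer than max_dist."""
    d = np.linalg.norm(hits[:, None, :] - hits[None, :, :], axis=-1)
    src, dst = np.nonzero((d < max_dist) & (d > 0))   # drop self-loops
    keep = src < dst                 # keep one direction per undirected edge
    return np.stack([src[keep], dst[keep]])

hits = np.array([[0.0, 0.0],
                 [1.0, 0.0],
                 [5.0, 5.0]])
edges = build_hit_graph(hits, max_dist=2.0)
# edges connects hits 0 and 1 only; hit 2 is isolated
```

The resulting edge index (shape 2 x n_edges) is the format typically fed to GNN libraries.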
  9. David Southwick (CERN)
    Track 1: Computing Technology for Physics Research
    Poster

    As part of the CERN-GEANT-PRACE-SKA collaboration, and in the context of EGI-ACE (Advanced Computing for the European Open Science Cloud), collaborators are working towards enabling efficient HPC use for big-data sciences. Approaching HPC sites with High Throughput Computing (HTC) workloads presents unique challenges in areas concerning data ingress/egress, the use of shared storage systems, and...

    Go to contribution page
  10. Mason Proffitt (University of Washington (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The ABCD method is a common background estimation method used by many physics searches in particle collider experiments and involves defining four regions based on two uncorrelated observables. The regions are defined such that there is a search region, where most signal events are expected to be, and three control regions. A likelihood-based version of the ABCD method, also referred to as the...

    Go to contribution page
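The four-region construction described above yields the classic ABCD background estimate: if the two observables are uncorrelated for background, the expected background in the search region A is $N_A = N_B N_C / N_D$. A synthetic-data sketch (toy distributions chosen for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent ("uncorrelated") observables for background events.
x = rng.exponential(1.0, size=100_000)
y = rng.exponential(1.0, size=100_000)

# Region A (search region): both cuts pass; B, C, D are control regions.
in_x, in_y = x > 1.0, y > 1.0
n_A = np.sum(in_x & in_y)
n_B = np.sum(in_x & ~in_y)
n_C = np.sum(~in_x & in_y)
n_D = np.sum(~in_x & ~in_y)

# Independence implies N_A / N_B = N_C / N_D, so the background in the
# search region can be predicted from the three control regions alone.
n_A_est = n_B * n_C / n_D
```

The likelihood-based variant mentioned in the abstract encodes the same relation as a constraint in a fit rather than as a simple ratio.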
  11. Grigory Rubtsov (INR RAS)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Baikal-GVD is a large-scale underwater neutrino telescope currently under construction in Lake Baikal. The experiment is aimed at the study of high-energy cosmic neutrinos and the search for their sources. The principal component of the telescope is a three-dimensional array of optical modules (OMs), which register the Cherenkov light associated with neutrino-induced particles. The OMs...

    Go to contribution page
  12. Prof. Ivan Kisel (Johann-Wolfgang-Goethe Univ. (DE))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Within the FAIR Phase-0 program the algorithms of the FLES (First-Level Event Selection) package developed for the CBM experiment (FAIR/GSI, Germany) are adapted for online and offline processing in the STAR experiment (BNL, USA).

    Long-lived charged particles are reconstructed in the TPC detector using the CA track finder algorithm based on the Cellular Automaton. The search for...

    Go to contribution page
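As background to the Cellular Automaton idea mentioned above: each track segment carries a counter, and in repeated sweeps a segment's counter grows by one whenever some compatible neighbour on the previous layer carries an equal counter, so the longest chains end up with the highest counters. A hypothetical minimal sketch, not the actual FLES/CBM implementation:

```python
def ca_sweep(segments, neighbours):
    """Cellular-automaton counter evolution.
    segments: {seg_id: counter}; neighbours: {seg_id: [upstream seg_ids]}.
    Iterate until the counters stop changing, then return them."""
    while True:
        new = {
            s: c + 1 if any(segments[n] == c for n in neighbours.get(s, []))
            else c
            for s, c in segments.items()
        }
        if new == segments:
            return segments
        segments = new

# Three segments forming one chain (a -> b -> c) plus an isolated segment d.
counters = ca_sweep({"a": 1, "b": 1, "c": 1, "d": 1},
                    {"b": ["a"], "c": ["b"]})
```

After convergence the counter of each segment equals the length of the longest chain ending there, so track candidates are collected by following segments of maximal counter backwards.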
  13. Lea Reuter (Institut für Experimentelle Teilchenphysik (ETP), Karlsruher Institut für Technologie (KIT), Germany)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Learning the hierarchy of graphs is relevant in a variety of domains, as graphs are commonly used to express chronological interactions in data structures. One application is in flavor physics, as the natural representation of a particle decay process is a rooted tree graph. 
    Analyzing collision events involving missing particles or neutrinos requires knowledge of the full decay tree....

    Go to contribution page
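One common way to turn a rooted decay tree into a fixed-shape learning target is the matrix of lowest-common-ancestor (LCA) depths over the final-state particles; whether this is the exact encoding used in the poster is an assumption, and all names below are hypothetical:

```python
def lca_depth_matrix(parent, depth, leaves):
    """Encode a rooted tree as the depth of the lowest common ancestor
    of every pair of leaves. parent: {node: parent}; depth: {node: depth};
    leaves: ordered list of final-state particles."""
    def ancestors(n):
        chain = [n]
        while n in parent:
            n = parent[n]
            chain.append(n)
        return chain

    def lca(a, b):
        anc = set(ancestors(a))
        for n in ancestors(b):
            if n in anc:
                return n
        raise ValueError("no common ancestor")

    return [[depth[lca(a, b)] for b in leaves] for a in leaves]

# Decay: root -> (X -> l1 l2), l3
parent = {"X": "root", "l1": "X", "l2": "X", "l3": "root"}
depth = {"root": 0, "X": 1, "l1": 2, "l2": 2, "l3": 1}
M = lca_depth_matrix(parent, depth, ["l1", "l2", "l3"])
```

The matrix depends only on the observable leaves, which is what makes it usable as a training target when intermediate particles are not reconstructed directly.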
  14. Henry Fredrick Schreiner (Princeton University)
    Track 1: Computing Technology for Physics Research
    Poster

    Histogramming for Python has been transformed by the Scikit-HEP family of libraries, starting with boost-histogram, a core library for high-performance Pythonic histogram creation and manipulation based on the Boost C++ libraries. This was extended by Hist with plotting, analysis-friendly shortcuts, and much more. UHI is a specification that allows histogramming and plotting libraries,...

    Go to contribution page
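To illustrate the kind of create/fill/manipulate operations such libraries standardize, here is a plain-NumPy sketch of a histogram and a rebin; this is deliberately NOT the boost-histogram/Hist API, just the underlying idea:

```python
import numpy as np

# Create: 10 uniform bins on [0, 1]; fill with 1000 uniform samples.
edges = np.linspace(0.0, 1.0, 11)
counts, _ = np.histogram(np.random.default_rng(2).uniform(size=1000),
                         bins=edges)

# Manipulate ("rebin by 2"): merge neighbouring bins by summing counts.
rebinned = counts.reshape(5, 2).sum(axis=1)
```

Libraries like boost-histogram wrap these operations in typed axis objects and fast C++ fills, and UHI specifies how different libraries exchange such histogram objects.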
  15. Gordon Watts (University of Washington (US))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    ServiceX is a cloud-native distributed application that transforms data into columnar formats in the Python ecosystem and the ROOT framework. Along with the transformation, it applies filtering and thinning operations to reduce the data load sent to the client. ServiceX, designed for easy deployment to a Kubernetes cluster, runs near the data, scanning TBs of data to send GBs to a client or...

    Go to contribution page
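The filtering and thinning mentioned above are, in columnar terms, boolean-mask selections applied uniformly across columns. A generic sketch of the idea (not the ServiceX API; the column names are made up):

```python
import numpy as np

# A toy columnar event record: one array per physics quantity.
events = {
    "pt":  np.array([12.0, 45.0, 8.0, 60.0]),
    "eta": np.array([0.3, -1.2, 2.1, 0.7]),
}

# Filter: build one boolean mask from the selection cuts...
mask = (events["pt"] > 10.0) & (np.abs(events["eta"]) < 2.0)

# ...then thin every column with it, shrinking the payload sent downstream.
skimmed = {name: col[mask] for name, col in events.items()}
```

Applying the selection server-side, close to the data, is what turns TBs of input into GBs of output for the client.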
  16. Sitong An (CERN, Carnegie Mellon University (US))
    Track 1: Computing Technology for Physics Research
    Poster

    Deep neural networks are rapidly gaining popularity in physics research. While Python-based deep learning frameworks for training models in GPU environments develop and mature, a good solution that allows easy integration of the inference of trained models into conventional C++- and CPU-based scientific computing workflows seems to be lacking.

    We report the latest development in ROOT/TMVA that aims to...

    Go to contribution page
  17. Paul Gessinger (CERN)
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    The great success of the Tracking Machine Learning Challenge (TrackML), conducted in two phases (accuracy phase from April to August 2018, throughput phase from September to November 2018), has proven the need for an easily accessible and yet challenging dataset for algorithm design and further R&D. The released TrackML dataset is to date heavily used by several research groups at the forefront of...

    Go to contribution page
  18. Gene Van Buren (Brookhaven National Laboratory)
    Track 1: Computing Technology for Physics Research
    Poster

    A unique experiment was conducted by the STAR Collaboration in 2018 to investigate differences between collisions of nuclear isobars, a potential key to unraveling one of the physics mysteries in our field: why the universe is made predominantly of matter. Enhancing the credibility of findings was deemed to hinge on blinding analyzers from knowing which dataset they were examining,...

    Go to contribution page
  19. Mr Dennis Noll (RWTH Aachen University (DE))
    Track 2: Data Analysis - Algorithms and Tools
    Poster

    Many HEP analyses are adopting the concept of vectorised computing, often making them increasingly performant and resource-efficient.
    While a variety of computing steps can be vectorised directly, some calculations are challenging to implement.
    One of these is the analytical neutrino reconstruction, which involves a fit that naturally varies between events.

    We show a vectorised...

    Go to contribution page
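A standard form of analytical neutrino reconstruction, which may or may not match the poster's exact setup, solves the W-mass constraint $m_W^2 = (p_\ell + p_\nu)^2$ for the neutrino $p_z$, giving a per-event quadratic. The quadratic can be solved for all events at once with NumPy, with complex roots flagging events where the discriminant is negative (a sketch assuming massless lepton and neutrino):

```python
import numpy as np

def neutrino_pz(lep_px, lep_py, lep_pz, met_x, met_y, m_w=80.4):
    """Vectorised solutions of the W-mass constraint for the neutrino p_z.
    All inputs are arrays (one entry per event); returns both quadratic
    roots, complex where the discriminant is negative."""
    lep_e = np.sqrt(lep_px**2 + lep_py**2 + lep_pz**2)   # massless lepton
    pt_l2 = lep_px**2 + lep_py**2
    mu = 0.5 * m_w**2 + lep_px * met_x + lep_py * met_y
    disc = (mu**2 - pt_l2 * (met_x**2 + met_y**2)).astype(complex)
    root = lep_e * np.sqrt(disc)
    return (mu * lep_pz + root) / pt_l2, (mu * lep_pz - root) / pt_l2

# Example: one event, lepton (30, 10, 20), missing transverse momentum (-25, 5).
pz_plus, pz_minus = neutrino_pz(np.array([30.0]), np.array([10.0]),
                                np.array([20.0]),
                                np.array([-25.0]), np.array([5.0]))
```

Because every step is an array operation, the "fit that varies between events" reduces to elementwise arithmetic plus masks on the discriminant, which is the essence of vectorising such a reconstruction.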