Introduction to the basics of physics analysis at the CMS experiment. Some of the tools used to visualize and analyze CMS data are demonstrated.
The students have reconstructed the Z boson and other particles from their decay products.
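The reconstruction mentioned above amounts to combining the four-momenta of the decay products and computing their invariant mass. A minimal Python sketch, assuming muon kinematics are given as (pT, η, φ); the kinematic values below are illustrative, not taken from the students' data:

```python
import math

MUON_MASS = 0.105658  # GeV/c^2, PDG value

def four_momentum(pt, eta, phi, mass):
    """Build a four-vector (E, px, py, pz) from detector kinematics."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(px * px + py * py + pz * pz + mass * mass)
    return (e, px, py, pz)

def invariant_mass(p1, p2):
    """Invariant mass of the two-particle system."""
    e, px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Hypothetical back-to-back muon pair (pT in GeV/c)
mu_plus = four_momentum(45.0, 0.5, 0.0, MUON_MASS)
mu_minus = four_momentum(45.0, -0.5, math.pi, MUON_MASS)
m_z = invariant_mass(mu_plus, mu_minus)
print(f"dimuon invariant mass: {m_z:.1f} GeV/c^2")
```

In an actual analysis the same calculation is applied to every opposite-charge muon pair in each event, and the Z boson appears as a peak near 91 GeV/c^2 in the resulting mass distribution.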
Overview of the quench protection systems of the LHC superconducting circuits: why such systems were built and what they do. The quench phenomenon is discussed, as well as the methods used at CERN to mitigate it.
The focus is on the energy extraction facilities as part of the quench protection mechanism.
The students have developed a small LabVIEW application to extract the temperature values...
The main principle of operation of the Resistive Plate Chambers (RPCs) and the basic technique for analyzing the data are discussed. Simulation of a small MC sample using CMSSW (the CMS software framework), including all the simulation steps from event generation up to reconstruction. The obtained sample is used to evaluate the predicted hit efficiency of the RPC chambers. The MC...
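The hit efficiency evaluated above is, in essence, the fraction of extrapolated track crossings for which a chamber records a matched hit. A minimal Python sketch of this counting, with illustrative numbers that are not taken from the actual MC sample:

```python
import math

def hit_efficiency(n_matched, n_expected):
    """Per-chamber hit efficiency: matched hits over expected track
    crossings, with a simple binomial uncertainty estimate."""
    if n_expected == 0:
        return 0.0, 0.0
    eff = n_matched / n_expected
    err = math.sqrt(eff * (1.0 - eff) / n_expected)
    return eff, err

# Illustrative counts for one chamber
eff, err = hit_efficiency(950, 1000)
print(f"efficiency = {eff:.3f} +/- {err:.3f}")  # efficiency = 0.950 +/- 0.007
```

Repeating this per chamber over the simulated sample gives the predicted efficiency map that can later be compared with data.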
Introduction to the CERN computer centre (CC) management infrastructure, namely the automation of the CC with Puppet and the software components for managing and distributing secrets and certificates on the hosts. A study of a couple of potential replacements for the secrets-storage tools. Installation and configuration of the tools, and tests of different data encryption techniques and interfaces...
Project 1: Introduction to GRID technology and how it is utilized in the CMS experiment, the computing infrastructure that lies beneath the complicated software packages used for physics analysis, and the submission tools developed to aid the work of the physicists.
Project 2: Introduction to the Computer-Aided Design (CAD), the Enterprise Asset Management (EAM) and the Engineering...
ISOLTRAP is a high-precision mass spectrometer at the radioactive ion-beam facility ISOLDE/CERN, which uses ion traps to measure the masses of short-lived radioactive isotopes for nuclear structure, astrophysics and weak-interaction studies. Currently, the ISOLTRAP team uses Time-of-Flight Ion-Cyclotron-Resonance, Multi-Reflection Time-of-Flight Mass Separation and Phase-Imaging...
Overview of the distributed computing operations of ATLAS, the biggest high-energy physics experiment. The concepts and technologies needed to operate 500 000 computing cores used by 3000 users will be discussed. A toy monitoring application for a 19 000-core computing farm was developed.
Basics of the physics simulation used in a large modern experiment, carried out with cutting-edge tools. Visualization of simulated events in ALICE, which are used by the physicists in their data analysis.
Introduction to the principles of distributed computing, providing CPU and storage needed to process and analyze hundreds of petabytes of data generated by the LHC. Demonstration of software tools used in the ALICE distributed computing infrastructure and development work of the next-generation software used to synchronize the work of hundreds of computing centers worldwide.
Search for strange particles produced in collisions at the LHC and recorded by the ALICE experiment. The task is based on the recognition of their V0 decays, such as K0S → π+ + π− and Λ → p + π−, and of cascades, such as Ξ− → Λ + π− (Λ → p + π−). The identification of the strange particles is based on the topology of their decay combined with the identification of the decay products; the information from...
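The invariant-mass part of this identification can be sketched as follows: the same pair of daughter tracks is evaluated under the mass hypotheses of the two V0 channels. The daughter momenta below are hypothetical illustrative values, not real ALICE tracks:

```python
import math

PROTON_MASS = 0.938272  # GeV/c^2, PDG value
PION_MASS = 0.139570    # GeV/c^2, PDG value

def v0_mass(p_pos, p_neg, m_pos, m_neg):
    """Invariant mass of a V0 candidate from the momenta (px, py, pz)
    of its two daughter tracks, under given mass hypotheses."""
    e_pos = math.sqrt(sum(c * c for c in p_pos) + m_pos * m_pos)
    e_neg = math.sqrt(sum(c * c for c in p_neg) + m_neg * m_neg)
    e = e_pos + e_neg
    p2 = sum((a + b) ** 2 for a, b in zip(p_pos, p_neg))
    return math.sqrt(max(e * e - p2, 0.0))

# Hypothetical daughter momenta at the decay vertex (GeV/c)
p_pos = (0.8, 0.1, 1.2)    # positive daughter track
p_neg = (0.3, -0.05, 0.5)  # negative daughter track

# Same track pair, tried under the two channel hypotheses
m_lambda = v0_mass(p_pos, p_neg, PROTON_MASS, PION_MASS)  # Lambda -> p + pi-
m_k0s = v0_mass(p_pos, p_neg, PION_MASS, PION_MASS)       # K0S -> pi+ + pi-
print(f"Lambda hypothesis: {m_lambda:.3f}, K0S hypothesis: {m_k0s:.3f} GeV/c^2")
```

A candidate is kept when the mass under one hypothesis falls in a window around the known particle mass and the decay topology (secondary vertex displacement, pointing angle) passes the selection cuts.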