The goal of this workshop is to give students and young researchers an opportunity to give presentations and to write proceedings. A morning plenary session, followed by 15+10 minute presentations, shall be the format across the three days. The topics to be covered will be high-energy theory and phenomenology (heavy ions, pp, ep, ee collisions), ATLAS physics and ALICE physics. Given the format, it is expected that all students will have a chance to present to their peers and senior physicists.
Heavy ion collisions at RHIC and at the LHC deposit an enormous amount of energy that melts the nuclei and their constituent particles, releasing gluons, quarks and anti-quarks travelling in different directions with different momenta. Studies of these collisions have shown that low transverse momentum observables describe a strongly coupled plasma (quark-gluon plasma), an almost perfect liquid that evolves hydrodynamically and flows with almost no viscosity. We make predictions for the suppression of the heavy flavor mesons into which heavy quarks hadronize, and thus describe the energy loss of these heavy quarks as they interact with the plasma; we show that these predictions are in good agreement with experimental data.
Chemical and thermal equilibrium properties of infinite relativistic hadron matter are investigated using a microscopic transport model, the Ultra-relativistic Quantum Molecular Dynamics (UrQMD) model, which is used to simulate ultra-relativistic heavy-ion collisions at different energy densities ε. The molecular dynamics simulation is performed for a system of zero baryon number density and light meson species (π, ρ and K) in a box with periodic boundary conditions. The equilibrium state is investigated by studying the chemical and thermal equilibrium of the system. The particle multiplicity equilibrates with time, and when thermal equilibrium is reached the energy spectra of the different light meson species have the same slopes, i.e. a common temperature. This study presents a full analysis of both chemical and thermal equilibrium before and after the system reaches the equilibrium state at different energy densities.
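As an illustration of the thermal-equilibrium criterion above, the sketch below fits the slope of a meson energy spectrum to extract a temperature; matching slopes across the π, ρ and K spectra would then signal a common temperature. This is a minimal toy (a pure exponential spectrum with an assumed T of 0.15 GeV, ignoring phase-space factors), not the UrQMD analysis itself.

    # Toy sketch: extract a temperature from an exponential energy spectrum,
    # dN/dE ~ exp(-E/T). Thermal equilibrium would show the same fitted T
    # for each meson species.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)
    energies = rng.exponential(scale=0.15, size=20000)  # toy spectrum, T = 150 MeV

    counts, edges = np.histogram(energies, bins=40, range=(0.0, 1.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = counts > 0

    def boltzmann(e, norm, temperature):
        return norm * np.exp(-e / temperature)

    (norm, T), _ = curve_fit(boltzmann, centers[mask], counts[mask], p0=(1000.0, 0.1))
    print(f"fitted T = {T:.3f} GeV")  # compare across pi, rho, K spectra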
Gauge-Higgs unification models offer interesting solutions to the hierarchy problem in particle physics. Such models are commonly studied by decomposing the 5-dimensional fields into 4-dimensional Kaluza-Klein modes, a handy way to compute the infinite sums appearing in the model. In order to take into account the running of coupling constants in these models, we propose a different decomposition using winding modes around the compactified fifth dimension. This decomposition not only permits us to take the running into account, but may also give faster-converging series for all quantities when summing over these modes.
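The complementarity of the two decompositions can be illustrated, schematically, by the standard Poisson resummation identity, which trades a Kaluza-Klein-type sum for a winding-type sum (the specific sums appearing in the model are assumed here to be of this general Gaussian form):

$$\sum_{n\in\mathbb{Z}} e^{-t\,n^{2}/R^{2}} \;=\; \sqrt{\frac{\pi R^{2}}{t}}\;\sum_{w\in\mathbb{Z}} e^{-\pi^{2} R^{2} w^{2}/t}.$$

For small $t$ the left-hand (Kaluza-Klein) sum requires many terms, while the right-hand (winding) sum is dominated by $w = 0, \pm 1$, illustrating the faster convergence.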
Composite Higgs models describe a strongly coupled gauge-fermion sector extending the Standard Model, in which the Higgs boson arises as a new bound state from the breaking of a global (flavour) symmetry. Such models are generically accompanied by light states generated by the same dynamics, whose detection may provide the first signs of compositeness. The subject of this work, a pseudo-scalar resulting from the breaking of a U(1) symmetry, is one such state. We study the phenomenology of this state, making a case for targeted low-mass searches at future lepton colliders, with a focus on production of the pseudo-scalar at the FCC-ee collider with a subsequent decay to a di-tau pair.
The production of a single top quark in association with a $W^{\pm}$ and a $Z$ boson ($tW^{\pm}Z$) is sensitive to both the neutral and charged electroweak couplings of the top quark, as the process involves the simultaneous production of a $W$ boson and a $Z$ boson in association with the top quark. However, the process is so rare that it has never been observed by any particle physics experiment. The latest datasets recorded by ATLAS are sufficiently large to allow a potential observation of the process. This talk will detail an ongoing analysis of the full Run II ATLAS dataset to measure the $tWZ$ process. The talk will focus on the development of a technique to kinematically reconstruct hadronically-decaying $W$ bosons with a machine-learning based approach, and on the optimisation of lepton $p_T$ criteria in the basic event selection.
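As a point of reference for the reconstruction technique mentioned above, the following is a minimal sketch of the simplest non-ML baseline: selecting the jet pair whose invariant mass is closest to the W mass. The jet kinematics are placeholders; the machine-learning approach in the talk would replace this mass-proximity choice with a learned pair score.

    # Baseline hadronic-W reconstruction: pick the jet pair closest to m_W.
    import itertools
    import numpy as np

    M_W = 80.4  # GeV

    def p4(pt, eta, phi, m):
        px, py, pz = pt * np.cos(phi), pt * np.sin(phi), pt * np.sinh(eta)
        return np.array([np.sqrt(px**2 + py**2 + pz**2 + m**2), px, py, pz])

    def inv_mass(a, b):
        s = a + b
        return np.sqrt(max(s[0]**2 - np.sum(s[1:]**2), 0.0))

    # Placeholder jets: (pt [GeV], eta, phi, mass [GeV]).
    jets = [p4(95.0, 0.3, 1.2, 8.0), p4(60.0, -0.7, -2.0, 6.0),
            p4(42.0, 1.1, 0.4, 5.0), p4(30.0, -0.2, 2.9, 4.0)]

    best_pair = min(itertools.combinations(range(len(jets)), 2),
                    key=lambda ij: abs(inv_mass(jets[ij[0]], jets[ij[1]]) - M_W))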
The LHC is a top quark factory, and the copious amounts of top quarks produced provide valuable insight into the Standard Model and beyond. The majority of the top quarks produced can be identified using standard methods, such as tagging bottom quarks (b-tagging), identifying W bosons, or finding three jets with an invariant mass roughly equal to the top mass. However, some of the top quarks will be highly boosted, so their decay products will be collimated into single jets. This hinders the standard methods and, consequently, subjet analysis is the natural next step. A brief investigation was conducted to determine whether top quarks can be distinguished from background by looking at the spatial distribution and number of subjets within a large-radius jet. Whilst the simplicity of the analysis severely limited any viable results, valuable insight into the nature of top quark and QCD jets was obtained. As expected, the subjets of QCD background jets tended to be more closely spaced (since those jets originate from high-pT partons that shower into many soft and collinear particles), while the subjets of top quark jets were more separated (reflecting the distinct decay products).
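A minimal sketch of the observables used in this kind of investigation, assuming subjet (eta, phi) coordinates are already available from reclustering (the values below are placeholders):

    # Subjet multiplicity and pairwise angular spread within one large-R jet.
    import itertools
    import numpy as np

    def delta_r(eta1, phi1, eta2, phi2):
        dphi = np.mod(phi1 - phi2 + np.pi, 2 * np.pi) - np.pi  # wrap to [-pi, pi)
        return np.hypot(eta1 - eta2, dphi)

    subjets = [(0.10, 1.20), (0.45, 0.90), (-0.20, 1.55)]  # (eta, phi) placeholders

    n_subjets = len(subjets)
    spreads = [delta_r(a[0], a[1], b[0], b[1])
               for a, b in itertools.combinations(subjets, 2)]
    mean_dr, max_dr = np.mean(spreads), np.max(spreads)
    # Top-like jets tend toward three well-separated subjets (larger mean/max dR);
    # QCD jets tend toward fewer, more collimated subjets.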
We investigate the discovery potential of a Stealth SUSY scenario involving squark decays by reconstructing the lightest neutralino decay products using a large-radius jet containing a high transverse momentum photon. Requirements on the event topology, such as the photon and large-radius jet multiplicity, result in less background than signal. We also estimated the sensitivity of our analysis and found that it has better exclusion potential than the strongest existing search for the specific benchmark points considered here.
Recent studies in particle physics have shown that there are myriad possibilities for strong dark sector studies at the LHC. One signature is the case of semi-visible jets, where parton evolution includes dark sector emissions, resulting in jets overlapping with missing transverse energy. Semi-visible jets are implemented using the Pythia Hidden Valley module to model the dark-sector showering. Owing to the unusual MET-along-the-jet event topology, which is as yet an unexplored domain within ATLAS, this search focuses on the performance and optimization challenges associated with such a unique final state, specifically looking at the small angular difference between the hardest jet and the missing transverse energy, and targeting a cut-and-count strategy.
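A minimal sketch of the angular selection just described: signal events have the missing transverse momentum aligned with the leading jet, so the cut keeps small |Δφ|. The event values and the 0.8 threshold below are illustrative, not the analysis selection.

    # dphi(leading jet, MET) cut-and-count selection on toy events.
    import numpy as np

    def delta_phi(phi1, phi2):
        return np.mod(phi1 - phi2 + np.pi, 2 * np.pi) - np.pi  # wrap to [-pi, pi)

    jet_phi = np.array([0.30, -2.10, 1.70])   # leading-jet phi per event
    met_phi = np.array([0.45, 1.00, 1.60])    # MET phi per event

    passes = np.abs(delta_phi(jet_phi, met_phi)) < 0.8  # illustrative threshold
    print(passes)  # [ True False  True]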
The standard electron and jet reconstruction processes, which use information from energy deposits in the electromagnetic (EM) and hadronic calorimeters, are carried out independently. This results in an ambiguity in the reconstruction of these objects. To avoid this ambiguity, an overlap removal procedure is applied during electron and jet reconstruction: every reconstructed electron will have a close-by jet associated with it, which needs to be removed to avoid double counting of these objects. Moreover, if the electron has several real jets close to it, the electron is discarded. The standard electron reconstruction process therefore requires some level of isolation from close-by hadronic activity and, as a result, becomes inadequate for a boosted topology where the electron is close to a real jet, such as in the boosted heavy neutrino analysis; in a boosted regime the electron can end up inside a real jet. This becomes a problem if we want to keep both the electron and the jet, because the standard electron reconstruction procedure then suffers a severe drop in identification efficiency, making the standard efficiency scale factors inadequate for such a topology. The ID variables for an isolated electron and for an electron in a jet are presented. These are the variables used for electron identification in the different likelihoods. The variables found to be robust against nearby hadronic activity are shown; these are the variables that can be used to reconstruct an electron close to a jet.
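For concreteness, a minimal sketch of a ΔR-based overlap removal of the kind described above, using commonly quoted cone sizes (remove the jet within ΔR < 0.2 of an electron, then remove electrons within ΔR < 0.4 of a surviving jet); the thresholds and object kinematics are illustrative, not the exact analysis configuration. The second step is what rejects genuine electrons inside jets in a boosted topology, motivating the dedicated ID variables discussed above.

    # Toy electron-jet overlap removal with illustrative dR thresholds.
    import numpy as np

    def delta_r(eta1, phi1, eta2, phi2):
        dphi = np.mod(phi1 - phi2 + np.pi, 2 * np.pi) - np.pi
        return np.hypot(eta1 - eta2, dphi)

    electrons = [(0.50, 1.00)]             # (eta, phi) placeholders
    jets = [(0.52, 1.05), (-1.20, -0.40)]  # (eta, phi) placeholders

    # Step 1: drop jets that overlap an electron within dR < 0.2.
    jets = [j for j in jets
            if all(delta_r(e[0], e[1], j[0], j[1]) >= 0.2 for e in electrons)]
    # Step 2: drop electrons within dR < 0.4 of a surviving jet.
    electrons = [e for e in electrons
                 if all(delta_r(e[0], e[1], j[0], j[1]) >= 0.4 for j in jets)]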
Due to a number of features in the proton-proton collision data taken during the Run 1 data-taking period at the LHC, a boson with a mass around the electroweak scale was postulated, such that a significant fraction of its decays would entail the Standard Model (SM) Higgs boson and an additional scalar, S. One of the phenomenological implications of a simplified model where S is treated as a SM Higgs-like boson is the anomalous production of leptons at the LHC. These discrepancies appear in Run 2 data in the corners of phase space predicted above. In these corners of phase space different SM processes dominate, indicating that the potential mismodeling of a particular SM process is unlikely to explain them. After summarising the anatomy of the anomalies, the talk will concentrate on the implications for measurements at the LHC. This will include, but will not be limited to, Higgs boson and top-mass related measurements. Implications in astro-particle physics and radio astronomy will also be discussed.
Anomalies observed in several Standard Model (SM) results with multiple leptons in the final state, from the ATLAS and CMS experiments at the LHC, are interpreted in the context of new physics in Refs. arXiv:1711.07874 and arXiv:1901.05300. This hypothesis extends the SM with additional bosons through the production of a heavy boson, $H$, decaying into a SM Higgs boson, $h$, and a singlet scalar, $S$, which is treated as a SM Higgs-like boson. In this work the impact of the new physics on measurements of the SM Higgs boson produced in association with a $W$ boson, using the Run 1 and Run 2 datasets of the LHC experiments, is studied. The Higgs decay modes considered here include $h\rightarrow WW,\tau\tau,\gamma\gamma$, and the associated vector boson enriches the studied final states with leptons or hadrons. The overall combination of the observed measurements results in a signal strength of $2.51 \pm 0.43$, which corresponds to a deviation from the SM value of unity of 3.5$\,\sigma$. This result is consistent with the previously observed discrepancies in final states with multiple leptons and further supports the possible existence of new physics at the LHC.
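As a quick cross-check of the quoted deviation, in a Gaussian approximation the significance follows directly from the combined signal strength and its uncertainty: $(\mu - 1)/\sigma_\mu = (2.51 - 1)/0.43 \approx 3.5$, consistent with the 3.5$\,\sigma$ stated above.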
With focus on the recent ATLAS search for top-associated Higgs production in multi-lepton final states, an anomalous rate for the ttW background is unearthed and quantified in terms of theory uncertainties. This anomalous rate is explored in the context of the previously published multi-lepton anomalies at the LHC (JHEP 1910 (2019) 157), using a simplified new physics model. The impact of the model on ttZ measurements is also determined and is shown to be consistent with a mild enhancement of events with low Z transverse momentum. Variables of interest are suggested for future experimental searches to constrain the anomalous ttW rate.
As no definite signs of new physics have been observed in LHC data yet, alternative approaches have been proposed. These include looking at unusual topologies, and using existing measurements to constrain models (CONTUR). In this overview, I will discuss some of the recent developments along these directions, covering jet substructure methods to identify semi-visible jets, a realistic detector smearing applicable to substructure, and reverse-engineering CONTUR to predict how sensitive a measurement will be to BSM scenarios.
This project studies a robust anti-QCD tagger using mass-decorrelated jet image data produced with the pre-processing method introduced in arXiv:1903.02032. A semi-supervised anomaly detection approach (trained on background only), using convolutional autoencoder neural networks, is explored as an anti-QCD tagger. Jet image data is used to train the algorithm instead of conventional high-level multivariate observables. The pre-processing steps perform momentum re-scaling so that all jets have the same mass (thus mass-decorrelating the jets), a Lorentz transformation so that all jets have the same energy, and removal of the residual rotation by applying the Gram-Schmidt procedure in the plane transverse to the jet axis. This is expected to increase the sensitivity of the autoencoder to non-hypothesised resonances and particles, as it will not be affected by non-linear correlations of the jet mass with other jet observables.
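A minimal sketch of the background-only autoencoder idea described above, assuming 32x32 single-channel jet images; the layer sizes, image size and training data here are illustrative, not the study's configuration.

    # Convolutional autoencoder trained on QCD jets only; anomalous jets
    # reconstruct poorly, so the per-jet reconstruction error is the score.
    import torch
    import torch.nn as nn

    class JetAutoencoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1),   # 32x32 -> 16x16
                nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1),  # 16x16 -> 8x8
                nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1),
                nn.ReLU(),
                nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),
                nn.ReLU(),  # pixel intensities are non-negative
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = JetAutoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    qcd_images = torch.rand(256, 1, 32, 32)  # placeholder pre-processed jet images
    for epoch in range(10):
        opt.zero_grad()
        loss = loss_fn(model(qcd_images), qcd_images)
        loss.backward()
        opt.step()

    anomaly_score = ((model(qcd_images) - qcd_images) ** 2).mean(dim=(1, 2, 3))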
We propose a new approach to search for new resonances beyond the Standard Model (SM) of particle physics in topological configurations using Machine Learning techniques. This involves a novel classification procedure based on a combination of weak supervision and full supervision in conjunction with Deep Neural Network algorithms. The performance of this strategy is evaluated on the production of the SM Higgs boson decaying to a pair of photons, both inclusively and in exclusive regions of phase space for specific production modes at the Large Hadron Collider (LHC), namely gluon-gluon fusion, the fusion of weak vector bosons, associated production with a weak vector boson, and production in association with a pair of top quarks. After verifying the ability of the methodology to extract different Higgs signal mechanisms, a search for new phenomena in high-mass diphoton final states is set up for the LHC.
Unlike supervised learning, which assumes full knowledge of the underlying model, weak supervision can extract new information from the data with only partial knowledge.
The objective of this study is to set up a search for heavy resonances at the electroweak scale with topological requirements. These resonances are expected to be produced through different production mechanisms. Here we focus on searches for new resonances in the di-photon final state. The performance of the weak supervision methodology is tested on Standard Model Higgs boson production using deep neural networks, and is then compared to the performance of the full supervision approach.
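A minimal sketch of the weak supervision idea in the classification-without-labels style: a classifier trained to separate two mixed samples with different (unknown) signal fractions learns the signal/background boundary without per-event labels. The toy features, fractions and network size below are illustrative.

    # Weak supervision on two mixed samples (toy Gaussians).
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    def mixed_sample(n, signal_fraction):
        n_sig = int(n * signal_fraction)
        sig = rng.normal(loc=2.0, scale=1.0, size=(n_sig, 2))      # toy signal
        bkg = rng.normal(loc=0.0, scale=1.0, size=(n - n_sig, 2))  # toy background
        return np.vstack([sig, bkg])

    # Two mixtures with different signal fractions (e.g. two phase-space regions).
    sample_a = mixed_sample(5000, 0.30)
    sample_b = mixed_sample(5000, 0.05)

    X = np.vstack([sample_a, sample_b])
    y = np.concatenate([np.ones(len(sample_a)), np.zeros(len(sample_b))])

    clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500)
    clf.fit(X, y)
    # The classifier output is monotonically related to the signal probability,
    # so it can be used directly as a signal discriminant.
    score = clf.predict_proba(X)[:, 1]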
What is typically referred to as the inverse problem in High Energy Physics (HEP) is the use of data to extract the key information needed to build a new theory. The search for new resonances beyond the Standard Model (SM) involves the use of different Machine Learning techniques. Building on recent major successes in deep learning, particularly deep generative algorithms, Generative Adversarial Networks (GANs), developed less than a decade ago, have proven promising for this purpose. The inverse problem can be addressed via a combination of GANs and weak supervision. Weak supervision provides a way of combining the already known information about the backgrounds with the unknown hidden in the data, and is used to extract features of the new Beyond the Standard Model signal; GANs are then used to create a Monte Carlo (MC) generator of the unknown signals with no significant loss in accuracy, potentially outperforming classical MC.
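To make the generative step concrete, here is a minimal GAN training loop that learns to generate a one-dimensional toy "signal" feature distribution; the network sizes, noise dimension and target distribution are all illustrative, not the analysis setup.

    # Minimal GAN: generator G maps noise to samples; discriminator D
    # distinguishes real from generated samples.
    import torch
    import torch.nn as nn

    G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))  # generator
    D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))  # discriminator (logits)

    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()

    for step in range(2000):
        real = torch.randn(64, 1) * 0.5 + 2.0  # toy "data" distribution
        noise = torch.randn(64, 8)
        fake = G(noise)

        # Discriminator update: real -> 1, fake -> 0.
        opt_d.zero_grad()
        loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
        loss_d.backward()
        opt_d.step()

        # Generator update: try to fool the discriminator.
        opt_g.zero_grad()
        loss_g = bce(D(G(noise)), torch.ones(64, 1))
        loss_g.backward()
        opt_g.step()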
In the search for new physics beyond the Standard Model, MVA techniques are used to extract specific signals from Standard Model background processes. In this study, weakly-supervised machine learning techniques are developed and evaluated on ATLAS experiment di-lepton (e±μ∓) final state data in the H → Sh search. These weakly-supervised techniques use labelled background data to extract an unlabelled signal, allowing the classification of signal information without restrictions based on previously defined physics. This study uses TMVA with ROOT to evaluate the effectiveness of weakly-supervised techniques compared to fully-supervised techniques using Boosted Decision Tree (BDT), Multilayer Perceptron (MLP) and Deep Neural Network (DNN) methods.
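For concreteness, a minimal sketch of booking BDT and MLP methods in TMVA via PyROOT is given below. The input file, tree names, variable list and option strings are placeholders; in the weakly-supervised setup the "signal" and "background" trees would hold the two mixed samples rather than truth-labelled events.

    # Minimal TMVA booking sketch (PyROOT); names are placeholders.
    import ROOT

    ROOT.TMVA.Tools.Instance()
    output = ROOT.TFile("tmva_out.root", "RECREATE")
    factory = ROOT.TMVA.Factory("WeakSupervision", output,
                                "!V:AnalysisType=Classification")
    loader = ROOT.TMVA.DataLoader("dataset")

    data = ROOT.TFile.Open("events.root")    # placeholder input file
    sig_tree = data.Get("mixed_region_a")    # mixture with higher signal fraction
    bkg_tree = data.Get("mixed_region_b")    # mixture with lower signal fraction

    for var in ["lep0_pt", "lep1_pt", "met", "mll"]:  # illustrative variables
        loader.AddVariable(var, "F")
    loader.AddSignalTree(sig_tree, 1.0)
    loader.AddBackgroundTree(bkg_tree, 1.0)
    loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random")

    factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT", "NTrees=400")
    factory.BookMethod(loader, ROOT.TMVA.Types.kMLP, "MLP", "HiddenLayers=N,N")
    factory.TrainAllMethods()
    factory.TestAllMethods()
    factory.EvaluateAllMethods()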
Motivated by the statistically significant excesses in the multi-lepton final states compatible with physics at the electroweak scale, here we attempt a direct search for a heavy scalar resonance in the Z and photon system in the LHC Run 2 dataset. The study aims to extract the signal process using a machine learning algorithm.
Satellite data enables the efficient mapping and monitoring of the earth's resources, ecosystems, and events. Machine Learning techniques can be used to model this data stream and extract useful information from it, for example to predict weather conditions. This helps governments and industries share information, make informed decisions, act on time, and provide improved or entirely new services. Machine Learning algorithms are used to scan huge masses of satellite imagery and to develop models that extract features, detect changes and predict upcoming situations. The satellite data scrutinized here is provided by the Sentinel missions, which include radar and super-spectral imaging for land, ocean and atmospheric monitoring.
In this article we search for a heavy resonance decaying into two photons in association with $b$-jets. The search uses $139~\mathrm{fb}^{-1}$ of proton--proton collision data recorded by the ATLAS detector at a centre-of-mass energy of 13 TeV during 2015 to 2018. Three models are tested in this final state: a Higgs-boson-like heavy scalar $X$ produced with top quarks, with $b$ quarks, or with a $Z$ boson decaying into $b\bar{b}$. Limits are set on these models for resonance masses ranging from 180 GeV to 1.5 TeV.
The 4-lepton final state is a clean and important signal studied at the ATLAS detector. In this study, we focus on four leptons originating from the $R\rightarrow SH\rightarrow 4\ell+E^{miss}_{T}$ signal. $R$ is a scalar boson produced via gluon--gluon fusion that decays to two lighter scalar bosons, $S$ and $H$. The $S$ decays to a pair of Standard Model neutrinos, which are therefore considered here as missing transverse energy, $E^{miss}_{T}$. The 4-lepton final state comes from the $H$ boson through its decay to $ZZ$. The signal region requires a four-lepton invariant mass, $m_{4\ell}$, greater than 200 GeV. This study helps to understand the nature of the relevant backgrounds for the $4\ell+E^{miss}_{T}$ signal in a control region defined by $m_{4\ell}$ in the range 140-200 GeV. A comparison between state-of-the-art Monte Carlo simulation of the background processes and the data, at an integrated luminosity of 139 fb$^{-1}$, is provided.
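A minimal sketch of the $m_{4\ell}$ observable that defines the signal and control regions, computed from (pt, eta, phi, mass) lepton four-vectors; the lepton kinematics below are placeholders.

    # Four-lepton invariant mass from (pt, eta, phi, mass) inputs.
    import numpy as np

    def four_vector(pt, eta, phi, mass):
        px, py, pz = pt * np.cos(phi), pt * np.sin(phi), pt * np.sinh(eta)
        e = np.sqrt(px**2 + py**2 + pz**2 + mass**2)
        return np.array([e, px, py, pz])

    leptons = [four_vector(50.0, 0.1, 0.3, 0.000511),   # electron
               four_vector(45.0, -0.5, 2.1, 0.000511),
               four_vector(60.0, 1.2, -1.0, 0.10566),   # muon
               four_vector(38.0, 0.4, 2.8, 0.10566)]    # all in GeV

    total = np.sum(leptons, axis=0)
    m4l = np.sqrt(max(total[0]**2 - np.sum(total[1:]**2), 0.0))
    # Signal region: m4l > 200 GeV; control region: 140 GeV < m4l < 200 GeV.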
The Technology and Innovation Platform (TIP) is envisaged to promote applications spanning the development of radiation detectors, special materials and industrial-standard electronics. The TIP will be implemented at iThemba LABS to explore applications and partnerships with industry, for technology transfer purposes, to benefit research and the economic sector. The TIP premises are currently being designed and the floor plan is at an advanced stage. An electronics lab, clean room, detector lab and design offices will accommodate the ongoing and upcoming projects. This presentation will give an overview of the facility and the related research projects.
Organisations worldwide are under pressure from investors to curb rising costs and maintain profits, leading them to come up with innovative solutions to traditional and new problems. Automation of processes and the use of analytics are key to achieving this objective. This talk discusses how basic and advanced classroom concepts (Mathematics, Statistics, Machine Learning) are being used in industry to solve complex problems while containing costs and increasing profits.
The Transition Radiation Detector (TRD), part of the ALICE experiment at CERN, is used for electron identification, triggering and tracking. This work presents a prototype of an event display, customised for the TRD, that provides a portable, projection-based display of tracks, tracklets and raw data within the detector, outside the classic ROOT environment. The prototype provides a novel ability to view ADC-level data associated with displayed tracklets, as well as a 3-dimensional interactive view. This work lays the foundation for the development of future browser-based event displays, and provides guidance for user-centric design in this space.
Photomultiplier tubes are susceptible to radiation damage within high energy and nuclear physics detectors, particularly from neutrons. More specifically, the integrity of the photocathode materials, responsible for emitting the primary electron that then interacts with the dynodes to create cascades of electrons moving through the photomultiplier, is affected. The photocathodes are made of materials with a low electron work function. We aim to assess the radiation damage and radiation hardness of different electron-emitting materials suitable as photocathodes. The electron emission of the materials will be assessed before and after irradiation, using a setup based on electron microscopy which is being developed in the School of Physics of the University of the Witwatersrand. The materials are exposed to different gamma radiation doses; the Co-60 facility at the CERN Prevessin site was used for the gamma irradiation. To complement these measurements, we will also study the structural damage induced in the materials using Raman spectroscopy.
We report on the replacement of the E3E4 (Crack) counters and the refurbishment of the Minimum Bias Trigger Scintillator (MBTS) counters as part of the Phase I upgrade of the ATLAS experiment at the European Organization for Nuclear Research. The Crack and MBTS counters, located between the central and extended Tile Calorimeter barrels, are used for correcting the electromagnetic and hadronic energy responses, respectively. They are situated in close proximity to the beam axis of the detector. During the Run 2 (2015-2018) data-taking period of the LHC at √s = 13 TeV, the Crack and MBTS scintillators were severely degraded by radiation and had to be replaced. The Phase I upgrade commenced at the beginning of LHC Long Shutdown 2. The upgrade activities, finalized with a strong contribution from South Africa, comprised the assembly of the Crack and MBTS counters, their qualification and characterization using radioactive sources (Sr90 and Cs137), and their installation on the ATLAS detector. The University of the Witwatersrand was previously involved in the radiation qualification and selection of the scintillator material used in the counter production.
A complete redesign of the detector electronics is currently taking place to adapt the readout and trigger architecture to the future HL-LHC conditions. The Tile PreProcessor (TilePPr) will be the core of the TileCal off-detector electronics after the Phase-II upgrade. The TilePPr is composed of several FPGA-based boards, including the Tile Compact Processing Module (TileCPM), to operate and read out the on-detector electronics. As part of the TilePPr module, the Gigabit Ethernet switch (TileGbE) mezzanine board will provide network communication to all the different submodules, and the Tile Computer on Module (TileCoM) mezzanine will be used to remotely configure the on-detector electronics and TilePPr FPGAs, as well as to interface with the ATLAS DCS system to provide monitoring data. The first version of an embedded Linux deployment for the ZYNQ System-on-Chip (SoC) of the TileCoM has been built and is running. The University of the Witwatersrand is involved in the production of the TilePPr modules, including the TileGbE switch and the TileCoM.
CERN will be undergoing an upgrade to the High Luminosity Large Hadron Collider. To connect the Phase 2 upgrade to the CERN network, a Gigabit Ethernet (GbE) switch mezzanine board is being designed as a part of the Tile PreProcessor (TilePPr). The boards being designed will undergo a variety of tests to determine their suitability for the Phase 2 upgrade at CERN, and a variety of testing and development is required to get the electronic systems ready. A test bench is being developed as a testing environment for the GbE switch before integration. Next, the communication with each of the 16 ports is to be tested; each of these ports is assigned to a component in the Phase 2 electronic system. The temperature and power measurements need to be tested to ensure that the board can operate safely and sustainably. Finally, a firmware and software upgrade of the switch is required, of which a prototype is already working.
The Tile Calorimeter (TileCal) is the central hadronic calorimeter ($|\eta|$ $<$ 1.7) of the ATLAS experiment, made out of iron plates and plastic scintillators. The TileCal is divided into three cylinders along the beam axis, each of which is azimuthally segmented into 64 wedge-shaped modules, staggered in the $\phi$ direction. The TileCal online software is a set of Trigger and Data Acquisition (TDAQ) software whose main purpose is to read out, transport and store physics data originating from collisions at the Large Hadron Collider (LHC). The ATLAS Local Trigger Interface (ALTI) module is a new electronic board designed for the ATLAS experiment at CERN as part of the Timing, Trigger and Control (TTC) system. It is a 6U VME module which integrates the functionalities of four legacy modules currently used in the experiment: the Local Trigger Processor, the Local Trigger Processor interface, the TTC VME bus interface and the TTC emitter. The ALTI module will provide the interface between the Level-1 Central Trigger Processor and the TTC optical broadcasting network to the front-end electronics of each of the ATLAS sub-detectors. The ALTI software needs to be developed and integrated into the Tile online software. Performance tests and maintenance of the ALTI module software will be carried out during the second half of the Long Shutdown 2 (Dec 2018 - Apr 2021) period, in preparation for the Run 3 (May 2021-2024) data-taking period.
The large-scale production of the LVPS bricks will involve the complete replacement of all power supply "bricks" in the TileCal (Tile Calorimeter) front-end electronics for the HL-LHC upgrade. A total of 1024 LV bricks (half of those needed for the entire detector) will be produced by the University of the Witwatersrand. Such an operation comprises several steps, including the development of two new custom quality assurance test stations. The initial test station will quantify a multitude of performance metrics of an LVPS brick, whereas the Burn-In test station will perform an endurance-type test and subject the LVPS brick to a stressed environment. Both custom test stations ensure the reliability and quality of the new LVPS which will power the next generation of the upgraded hardware system of ATLAS at CERN.
This paper describes the development of test stations at the University of the Witwatersrand for the ATLAS Tile Calorimeter Low Voltage Power Supplies at the Large Hadron Collider. As part of the Phase II upgrade cycle, South Africa will produce and test half of the LVPS bricks that will power the front-end electronics of the detector. The Burn-in station is required to detect early failures in components, thereby increasing component reliability. Here we describe the design and development of the Burn-in station for the electronic boards.
The upgrade of the ATLAS hadronic Tile Calorimeter Low-Voltage Power Supply (LVPS) falls under the high-luminosity LHC upgrade project. This presentation provides a detailed overview of the performance testing of an upgraded LVPS component, known as a brick, being undertaken by the Wits high-energy physics institute in preparation for full-scale production within South Africa. This testing involves two distinct test stations, known as the Production initial test station and the Burn-in station, both of which are being constructed at Wits. They serve to quantify various performance metrics in order to enforce stringent quality control of the LVPS bricks before installation within ATLAS, and will be covered in detail.
A functional carbon nano-composite material is investigated for use as a Thermal Interface Material (TIM) in heat sink applications. The TIM is a composite in paste form, based on carbon nanomaterials and a silicone heat transfer compound. The goal of incorporating the carbon nanomaterial in the TIM is to increase the thermal transfer from the electronics to the heat sink through the intermediary of aluminium oxide (Al₂O₃) posts. The main nanomaterials investigated in this work are carbon nanospheres (CNS) of 450 nm diameter produced by Chemical Vapor Deposition (CVD). The study also included hollow carbon nanospheres (HCNS) and carbon nanotubes (CNT) in the composite. The heat transfer efficiency of the nano-composite is investigated by varying the ratio of the carbon nanomaterials within the composite, and the temperature evolution over time at the heat sink, with and without the carbon nanomaterials in the TIM, is measured and compared.
The struggling global economy and the constant increase in population are leading to a very competitive era for both industry and academia. It has become quite challenging to break into industry without experience, especially as a fresh graduate or an academic. Industry is evolving slightly faster than academia in order to cope with business needs. This talk discusses how I made the transition from graduate student to industry. It also aims to give tips and tools that will help you better prepare and stay competitive while completing your studies.