Canadian Astro-Particle Physics Summer Student Talk Competition

Canada/Eastern
Day One (Monday, 23 August 2021) - 10:45am ET Start
https://queensu.zoom.us/j/96737058866?pwd=OWlCcVRpTnYwSFYxVFNXVFVuanBKQT09
Meeting ID: 967 3705 8866 | Passcode: 593378

Day Two (Tuesday, 24 August 2021) - 12:30pm ET Start
https://queensu.zoom.us/j/95042947884?pwd=bHNzbFFEWnFZSDZaSHpwWUEwdVZ0dz09
Meeting ID: 950 4294 7884 | Passcode: 919290
Benjamin Tam (Queen's University), Christine Kraus (Laurentian University)
Description

CASST: The Canadian Astro-Particle Physics Summer Student Talk Competition!

Registrations for presenters and abstract submissions are now closed! Feel free to register as a viewer. More information regarding the programme will be forthcoming to registrants.

All summer students who are not currently engaged in graduate studies and do not hold a graduate degree are encouraged to attend. All talks will be 8-10 minutes (plus questions). A final programme will be posted following registration. All talks will be held virtually.

Please self-identify the main focus of your talk as one of the following:

  1. Simulations and Data Analysis
  2. Hardware and Instrumentation
  3. Theory
  4. Engineering
  5. Science Communication and Outreach (including education)

Two competitors will be granted an all-expenses-paid opportunity to present at the 2022 Canadian Association of Physicists (CAP) Congress, the prestigious annual meeting of all Canadian physicists. The 2022 CAP Congress will occur June 6-10 at McMaster University in Hamilton, Ontario.

The competition is co-sponsored by SNOLAB and the McDonald Institute.

Award Winners:

1st Place | Julia Azzi

2nd Place | Emma Klemets

3rd Place | Abbygale Swadling

4th Place | Clara Mitchinson

4th Place | Alexander Pleava

6th Place | Anthony Allega

Registration
Registration FOR NON-PRESENTERS
Registration FOR PRESENTERS
    • 1
      Opening Remarks
      Speaker: Benjamin Tam (Queen's University)
    • Session 1

      10+3 talks.
      11am ET Monday Morning

      Conveners: Ana Sofia Inacio (Laboratório de Instrumentação e Física Experimental de Partículas), Leon Pickard
      • 2
        K40 Backgrounds in the SNO+ Neutrino Detector

        SNO+ is a scintillator-filled neutrino detector located 2 km underground at SNOLAB. The primary goal of the SNO+ experiment is to search for neutrinoless double beta decay (0vbb). The rarity of this phenomenon necessitates a high level of sensitivity, making background analysis crucial. In this presentation I will outline the methods used to find and characterize K40 backgrounds in SNO+ – a signal that is notoriously difficult to measure given its statistical rarity. The methods used in this study allowed the first-ever direct measurement of the K40 background from data collected by a neutrino detector partially filled with liquid organic scintillator. The applicability of this measurement is extensive: in addition to providing a quantitative measure of the K40 background, this signal can also be used to calibrate the SNO+ neutrino detector in an energy range where there are no other reliable sources.

        Speaker: Parmesh Ravi (University of Waterloo)
      • 3
        Dark matter search and calibration of detectors at Université de Montréal

        The detection and study of dark matter is a major challenge in modern physics. Minimizing background noise and improving detector technologies is currently an international race among astroparticle physicists. The strategy of SuperCDMS (Super Cryogenic Dark Matter Search) is to detect very low energy deposits using germanium and silicon detectors. To efficiently calibrate the detectors, Université de Montréal is hosting the IMPACT experiment (Ionization Measurement with Phonons At Cryogenic Temperatures), which will probe nuclear recoils at energies lower than ever previously tested. During the summer, using measurements and simulations, I evaluated the impact of a germanium target on neutron scattering using a He3 neutron detector and a 4.8 keV neutron beam. This measurement is critical for the proper conduct of the experiment and for the calibration of the germanium HVeV detectors to verify their sensitivity to dark matter.

        Speaker: Audréanne Matte-Landry (Université de Montreal)
      • 4
        Lab Weather Station

        Sensitive experiments often require strict regulations to ensure that the surrounding environment is clean. The goal of this project is to record environmental variables continuously over years using various sensors, such as a particulate counter and a differential pressure sensor, and to upload the data to a server using a Raspberry Pi. Using Python submodules specific to each sensor, properties such as temperature, pressure differential, humidity, and dust levels are monitored and uploaded to a database at periodic intervals. This allows the user to verify whether the lab has a positive pressure differential relative to the outside to keep dust out, and to correlate when and how dust particles enter. A 3D-printed case has been designed to package all the components together.
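The logging loop described in the abstract could be sketched roughly as follows; the sensor readings, table schema, and function names are hypothetical stand-ins, since the project's actual sensor submodules and server are not shown:

```python
import sqlite3
import time

# Hypothetical sensor read-out. The real station would call the Python
# submodule specific to each sensor (particulate counter, differential
# pressure sensor, etc.); fixed values stand in for those calls here.
def read_sensors():
    return {
        "temperature_c": 21.4,
        "pressure_diff_pa": 5.2,   # positive => lab above outside pressure
        "humidity_pct": 38.0,
        "dust_per_litre": 120.0,
    }

def log_once(conn, timestamp=None):
    """Sample every sensor once and append one row to the database."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings ("
        "ts REAL, temperature_c REAL, pressure_diff_pa REAL, "
        "humidity_pct REAL, dust_per_litre REAL)"
    )
    row = read_sensors()
    conn.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
        (time.time() if timestamp is None else timestamp,
         row["temperature_c"], row["pressure_diff_pa"],
         row["humidity_pct"], row["dust_per_litre"]),
    )
    conn.commit()

# A Raspberry Pi deployment would call log_once(...) on a timer,
# e.g. once per minute, against the server-side database.
```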

        Speaker: Felix Belair (McGill University)
      • 5
        Cherenkov Light Simulations with Chroma for nEXO’s Muon Veto

        nEXO is a proposed neutrinoless double beta decay experiment currently in the design stage. nEXO is expected to reach a half-life sensitivity of approximately $10^{28}$ years, which requires ultra-low backgrounds. To achieve this, the plan is to build the detector 2 km underground at SNOLAB to reduce backgrounds from cosmic radiation, as well as use a water-Cherenkov muon veto, the focus of this summer project. The muon veto, also known as the outer detector (OD), is a large tank of ultra pure water. When high energy cosmic muons pass through the outer detector, they produce UV photons in a process called Cherenkov radiation. These photons are detected by photomultiplier tubes (PMTs) inside the water tank and let us tag the muons passing by. The muons’ energy signals can then be removed from the total data recorded from nEXO’s main inner detector to get cleaner results.

        Before finalizing the design of the outer detector, simulations need to be run to optimize the placement of the PMTs that will detect the Cherenkov light. This summer, we are adapting Chroma, a ray-tracing program already being used for nEXO’s inner detector, to simulate light propagation in the outer detector. Besides being easily adapted to new detector setups, Chroma also aims to be faster than comparable simulations in Geant4, the C++ Monte Carlo toolkit used for most of nEXO’s current simulations. This summer's work has mainly focused on implementing the theory required to generate Cherenkov photons with the correct properties for Chroma to then propagate. The outer detector geometries and material optical properties have also been added to Chroma. The final work of the summer will be running large simulations and further developing the analysis pipeline.

        Speaker: Emma Klemets (McGill University, UBC)
      • 6
        Hunting Unicorns: Model-Free Detection of Time Series Anomalies in GW Detector Data

        The data from current gravitational wave detectors, including Advanced LIGO and Advanced Virgo, contains a high rate of “glitches”: transient noise that can obscure true detections of gravitational wave events. I propose a novel method for identifying and characterizing these noise events by leveraging the Temporal Outlier Factor (TOF): a technique that uses higher-dimensional embedding to recreate the dynamical phase space of a time series, then correlates spatial and temporal clustering to find unique anomalies in the data. I present preliminary results demonstrating TOF’s efficacy in detecting known glitch classes in Advanced LIGO detector data, then discuss potential applications for detecting unknown glitch types.
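The phase-space reconstruction behind TOF is a time-delay embedding; the sketch below is a simplified illustration (the scoring step is a toy stand-in for the actual TOF statistic, not the speaker's implementation):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Map a 1-D time series into dim-dimensional delay vectors,
    reconstructing a proxy for the system's dynamical phase space."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

def temporal_neighbour_gap(x, dim=3, tau=1):
    """For each embedded point, the time gap to its nearest spatial
    neighbour. TOF-style reasoning: points whose spatial neighbours are
    also their temporal neighbours (small gap) are anomaly candidates."""
    emb = delay_embed(np.asarray(x, dtype=float), dim, tau)
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)          # exclude self-matches
    nearest = np.argmin(dist, axis=1)
    return np.abs(nearest - np.arange(len(emb)))
```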

        Speaker: Mr Julian Ding (University of British Columbia)
      • 12:15
        Break
    • Session 2

      10+3 talks
      12:30pm ET Tuesday

      Conveners: Ana Sofia Inacio (Laboratório de Instrumentação e Física Experimental de Partículas), Leon Pickard
      • 7
        Ab Initio Correlation between Double Gamow-Teller Transitions and Neutrinoless Double Beta Decay

        Neutrinoless double-beta ($0\nu\beta\beta$) decay is a beyond-Standard-Model process in which two neutrons decay into two protons and two electrons, without any neutrinos in the final state. If this process is observed experimentally, this would violate lepton number conservation and demonstrate that the neutrino is a Majorana fermion, i.e., its own antiparticle. In order to interpret experimental searches, accurate calculations of $0\nu\beta\beta$ decay nuclear matrix elements (NMEs) are essential, and recent advances in first-principles, or \textit{ab initio}, methods now allow calculations of NMEs with reduced uncertainty [1]. Additionally, a recent study demonstrated that $0\nu\beta\beta$ decay and double Gamow-Teller (DGT) transition NMEs, calculated using phenomenological techniques, exhibit a strong linear correlation [2].

        In this presentation, I will discuss the correlation between $0\nu\beta\beta$ and DGT NMEs calculated using the \textit{ab initio} valence space in-medium similarity renormalization group (VS-IMSRG) approach. In particular I calculate NMEs for lower fp-shell isotopes with $20 < Z < 24$ and $44 < A < 62$, and for select heavier isotopes including $^{76}$Ge, $^{82}$Se, $^{130}$Te, and $^{136}$Xe using several $NN + 3N$ interactions. These results show a strong, linear correlation between DGT decay and $0\nu\beta\beta$ decay NMEs, indicating that DGT decay, a process that is allowed by the Standard Model, can be used to experimentally constrain the NMEs of $0\nu\beta\beta$ decay, a beyond-Standard-Model process.

        References
        [1] A. Belley, C. G. Payne, S. R. Stroberg, T. Miyagi, and J. D. Holt, “Ab Initio Neutrinoless Double Beta Decay Matrix Elements for $^{48}$Ca, $^{76}$Ge, and $^{82}$Se”, Phys. Rev. Lett. 126, 042502 (2021).
        [2] N. Shimizu, J. Menéndez, and K. Yako, “Double Gamow-Teller Transitions and its Relation to Neutrinoless $\beta\beta$ Decay”, Phys. Rev. Lett. 120, 142502 (2018).

        Speaker: Izzy Ginnett (TRIUMF)
      • 8
        Evaluation of the C14 background in the SNO+ detector

        SNO+ is a neutrino detector where the active medium is a liquid organic scintillator (LAB). It is important to understand the backgrounds in detail. C14 is a source of background events in the SNO+ detector. C14 decays can be observed homogeneously throughout the LAB in the detector. The detector is now completely filled with LAB and the process of adding a wavelength shifter (PPO) is ongoing. This leads to mixing effects and possibly uneven distribution of the wavelength shifter. Changes in these effects over time and position are therefore important to monitor. This presentation will show this analysis for recent data.

        Speaker: Keegan Paleshi (SNOLAB)
      • 9
        Machine Learning for Top Quark Pair Kinematic Reconstruction

        Studies of the top quark provide unique insights into the Standard Model due to its large mass. However, the kinematics of $t\bar{t}$ decays is difficult to reconstruct due to the complexity of these events and limited detector resolution. Neural networks are thought to perform as well as state-of-the-art statistical algorithms for reconstruction purposes.

        Our group has developed a machine learning package called AngryTops, a BLSTM neural network that reconstructs $t\bar{t}$ decay pair kinematics resulting from 13 TeV $pp$ collisions. Although the package successfully reconstructs the kinematic variable distributions, we lack a systematic way to evaluate the network's performance on individual events. We implement improvements to better characterise the network's performance. We also introduce an algorithm that matches the observed and truth four-momenta. The variables used for matching are then used to filter the training dataset, retaining only events that the network should be able to reconstruct well.

        We train the network on the filtered dataset and evaluate how the network performs compared to the original. Further developments including data augmentation and fine-tuning parameters will be investigated using the matching algorithm as a performance metric.

        Speaker: Maggie Wang (University of Toronto)
      • 10
        Measurements of Radon Concentration in SNOLAB air

        Underground at SNOLAB there are many ongoing experiments, such as SNO+, REPAIR, and SuperCDMS, that monitor and are extremely sensitive to background radiation. One unwanted source of background radiation present at SNOLAB is the naturally occurring radioactive gas radon-222. Radon is produced in the decay chain of uranium, which is found in small amounts in all rock, and as it decays further into polonium-218 and bismuth-214 it emits alpha and beta particles. These particles are unwanted backgrounds that add noise and clutter to the results of these detectors, so it is important to monitor the radon concentration levels underground at SNOLAB. One of the detectors used to monitor radon concentrations underground is the RAD7. Analysis of the RAD7 data revealed that fluctuations in the underground mine ventilation correlate with changes in the detected radon concentration; however, the mine ventilation data is not readily accessible for analysis. I therefore compared the radon concentration data with other variables that are continuously monitored underground (pressure, drift flow rate, temperature, etc.) in hopes of finding a correlation with, and an explanation for, the fluctuating radon concentration levels detected underground at SNOLAB. This analysis will be presented in this talk.
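The comparison against the monitored variables amounts to a correlation scan; below is a minimal sketch with synthetic stand-in series (the toy linear model and all values are invented, not SNOLAB data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly monitoring channels standing in for the real data.
drift_flow = rng.normal(10.0, 1.0, 500)     # drift flow rate (arb. units)
pressure = rng.normal(101.0, 0.5, 500)      # barometric pressure (kPa)
# Toy model: radon falls when ventilation flow rises, plus detector noise.
radon = 50.0 - 3.0 * drift_flow + rng.normal(0.0, 0.5, 500)

# Pearson correlation of radon against each candidate variable.
correlations = {
    name: np.corrcoef(radon, series)[0, 1]
    for name, series in [("drift_flow", drift_flow), ("pressure", pressure)]
}
```

In this toy setup the scan flags the strongly anti-correlated flow channel while the unrelated pressure channel stays near zero.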

        Speaker: Matt Stollman (SNOLAB)
      • 11
        Using Machine Learning to Denoise Acoustic Pulses

        The project I am working on is in Simulations and Data Analysis. I am working in the GeRMLab to use machine learning to remove electronic noise from pulses collected with a germanium semiconductor detector. Clean pulses were both simulated and approximated by mathematical functions. Flat noise was added to these pulses, and these, along with noisy pulses from the detector, were used to train, validate, and test the denoising model.

        I am working this summer on transferring this method to acoustic pulses collected with the PICO dark matter experiment. I received data from PICO to train and test the model, and I used transformations to augment the dataset. I will then use the noisy acoustic pulses to train and optimize a model to denoise them.
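The dataset augmentation step might look like the following sketch; the shift and amplitude ranges are illustrative assumptions, not the transformations actually applied to the PICO pulses:

```python
import numpy as np

def augment_pulses(pulses, rng, max_shift=5, amp_range=(0.8, 1.2)):
    """Expand a set of 1-D pulses with random circular time shifts
    and amplitude scalings, preserving each pulse's shape."""
    out = []
    for pulse in pulses:
        shift = int(rng.integers(-max_shift, max_shift + 1))
        scale = rng.uniform(*amp_range)
        out.append(scale * np.roll(pulse, shift))
    return np.array(out)
```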

        Speaker: Charlotte Reed
      • 13:45
        Break
    • Session 3

      10+3 talks
      2pm ET Monday

      Conveners: Matthew Stukel (Queen's University), Leon Pickard
      • 12
        An Analysis of C14 for SNO+

        SNO+ is a liquid organic scintillator detector aiming to study neutrinos; it is now completely full of scintillator, with the addition of wavelength shifter ongoing.
        C14 is a background within the SNO+ detector whose beta decay rate has been measured at 0.8 Hz/m^3, compared with 1.3 Hz/m^3 from simulation. This makes it a uniform, high-rate, and dependable source within the acrylic vessel (AV). The analysis of C14 can be used to determine global detector efficiency, as well as to understand how PPO (wavelength shifter) loading affects light yield over time. Loading more PPO into the detector increases the light yield from the C14, so if the C14 signal can be tracked within the AV, the rate of PPO addition can also be tracked.

        Speaker: Victoria Howard (SNOLAB)
      • 13
        Analyzing Neck Events in the SNO+ Experiment

        The SNO+ experiment is located 2 km underground in Sudbury, Ontario, to shield it from cosmic radiation in its search for neutrinos, the "ghost particle". Using the 780 tonnes of liquid scintillator, primarily linear alkylbenzene (LAB), that make up the SNO+ active volume, we observe many types of events in the detector; detecting neutrinos therefore requires very careful data selection to avoid spurious signals, which can distort the expected theoretical result. One possible source of spurious events is the neck of the detector. This presentation will briefly discuss an analysis of neck events in SNO+ and their possible impact on SNO+ data.

        Speaker: Mr Kai Soini (SNO+ - Queen's University)
      • 14
        NEWS-G3 Muon Background Simulation

        NEWS-G searches for dark matter using spherical proportional counters (SPCs), which are large, gas-filled metal spheres with a small sensor at the centre held at high voltage. SPCs are used to detect the interaction of particles (such as dark matter or neutrinos) in the gas by measuring the charge induced on the sensor due to amplification of the primary ionization near the sensor. The NEWS-G3 project is a compact shield for a 60 cm SPC that consists of 8 layers of copper, lead, and polyethylene, one of which is a muon veto designed to detect and eliminate the background caused by cosmic muons. This detector will be used to study the feasibility of coherent elastic neutrino-nucleus scattering (CEνNS) measurement at a nuclear reactor with a high neutrino flux. In this talk, I will present a summary of the NEWS-G3 project and my research on the simulation of the NEWS-G3 muon veto performance in Geant4, including estimates of the expected background caused by muons and the active time in the detector.

        Speaker: Clara Mitchinson (Queen's University)
      • 15
        Feature Recognition and labelling for Photogrammetry Calibration of the Super-Kamiokande Detector

        The Super-Kamiokande detector is a cylindrical tank, 40 m tall and 40 m in diameter, filled with ultrapure water. It makes detailed measurements of solar, atmospheric, and accelerator neutrinos. About 11,000 inward-facing photomultiplier tubes (PMTs) are mounted on the detector wall to record neutrino interaction events. Accurate knowledge of the PMT locations on the detector wall will improve the accuracy of particle detection in the experiment. Over 15,000 images (57 GB) of Super-Kamiokande were taken with an underwater drone to reconstruct the locations of the PMTs using photogrammetry. For reconstruction, we first need to find each PMT in an image and then assign it an ID. In this study, we determined the locations of the bolts surrounding each PMT using image processing techniques, with further processing to eliminate false bolts. We then performed pattern matching between images of the detector wall and a perspective projection of the PMTs in the detector to find an ID for each PMT. In this talk, I will present the methods used in our study for detecting and labeling PMTs in our image set.

        Speaker: Tapendra B C
      • 16
        PICO 500 Muon Veto Optical Calibration System

        The PICO 500 detector will continue the search for one of the leading dark matter candidates: the WIMP. Even under 6800 feet of rock shielding at SNOLAB, the PICO 500 bubble chamber is still exposed to a background of cosmic muons. The resulting interactions within the chamber’s active fluid are very difficult to discern from the extremely rare WIMP-nucleon interactions that the experiment will attempt to detect. The muon veto subsystem of PICO 500 uses PMTs to detect the Cherenkov light emitted by muons passing through the water tank surrounding the bubble chamber, and tags muon-related events so that they are not mistaken for a WIMP interaction. In order to accurately place the arrival of a muon on a timeline relative to a detector event, it is crucial for the muon veto to be precisely calibrated in the time domain. This talk will cover the design of the PICO 500 muon veto optical calibration system, including light source layout optimization and mechanical design, with an emphasis on using and generating very short nanosecond scale LED light pulses to calibrate the PMTs.

        Speaker: Laurie Amen (Queen's University)
      • 15:15
        Break
    • Session 4

      10+3 talks
      3:30pm ET Monday

      Conveners: Matthew Stukel (Queen's University), Leon Pickard
      • 17
        Improving Gravity Spy's Classification Accuracy of Real Gravitational Wave Events and Excess Noise Artifacts

        A key issue in the search for Gravitational Waves (GWs) by the Advanced Laser Interferometer Gravitational Wave Observatory (LIGO) and Advanced Virgo detectors surrounds the presence of excess noise transients (glitches) that pollute the detector data. These glitches come from a range of sources and can often mimic the form of real astrophysical events. To distinguish between real events and glitches efficiently and accurately, we turn to Gravity Spy, a machine learning program utilising citizen science effort. However, Gravity Spy has its limitations in terms of classification accuracy. Specifically, it has trouble classifying signals that come from black hole mergers on the very high and low mass range of the spectrum. We present our work to improve Gravity Spy and how its performance has progressed. This work hopes to ensure a safe and rapid method of event classification for the expected high candidate event rate of the next observing run.

        Speaker: Seraphim HSIEH JAROV (University of British Columbia)
      • 18
        URM Pressure Equalizing System

        SNO+ designed an Umbilical Retrieval Mechanism (URM) to deploy radioactive sources for the calibration of the detector. The sealed URM must maintain a near-zero differential pressure between itself and the air pressure in the underground laboratory, thus requiring a Pressure Equalizing System.

        The Pressure Equalizing System consists of a Cover Gas Bag designed to adjust its volume to counteract any pressure changes. The Bag has a "pillow" shape and a flexseal coating for additional protection from the lab's activities and to minimize potential failures. The frame for the Cover Gas Bag is designed with track rollers and guide rails to allow the bag to move freely while in operation, preventing creasing and indentation around the flange.

        Speaker: Ashley McCambley
      • 19
        Finding Functional Approximations of the Absolute Counting Efficiencies for Radon-222 and Radon-220 for ESCs under Certain Pressures and High Voltages

        Construction materials for equipment used in experiments contain natural radioactivity. High-energy photons and beta rays from the late uranium and thorium chains can act as a source of background for experiments like the original SNO experiment. To mitigate this, construction materials are required to be ultra-low-radioactivity materials, which in turn requires methods to measure the radioactivity of materials. One such method is radon emanation measurement, which uses an electrostatic counter (ESC) to measure the number of decays of polonium isotopes from the uranium and thorium chains. Knowing the absolute counting efficiency makes it possible to determine the amount of a given radioisotope originally present in a material. The aim of this summer job was to find functional approximations of the absolute counting efficiencies of ESCs for radon-222 and radon-220 at pressures and high voltages of 25 mbar and 600 V, 100 mbar and 600 V, and 1000 mbar and 1000 V.

        Speaker: Drake Wickman (nEXO)
      • 20
        Particle Discrimination Using Machine Learning Techniques

        The PICO experiment utilizes a bubble chamber to search for weakly interacting massive particles (WIMPs). Events in the detector generate acoustic signals due to bubble creation caused by energetic incident particles, e.g., alpha particles, neutrons, gammas, or WIMPs. A use of these acoustic signals is to distinguish among the different events triggered by different types of incident particles, more specifically between alphas and neutrons.

        One tool being used to make this discrimination is the acoustic parameter (AP). This tool is highly effective and has an accuracy rate of up to 99%, but we do not fully understand how the AP emerges in the signal. Our goal is to use machine learning techniques to better understand how discrimination among signals from different types of incident particles can be modelled, and furthermore learn more about how the bubble grows.

        To do this we are designing a gradient boosted decision tree that will distinguish between different types of trigger particles based on specific sets of features found in the signals. The signals used in this analysis will come from both the PICO experiment as well as earlier experiments (specifically PICASSO) which had the same goals but used a superheated droplet technique instead of a bubble chamber. Our work involves finding a group of specific variables that the classification tree can use to obtain accuracy results comparable to a neural network being designed concomitantly by Kyle Yeates (also presenting a talk at CASST 2021), and through those variables be able to explain the physical properties of the signal that can be used to distinguish between different types of particles.
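One family of features such a tree can consume is band-integrated spectral power of the acoustic signal; the sketch below is a hedged illustration (the band edges and sampling rate are invented, and the actual acoustic parameter definition differs in detail):

```python
import numpy as np

def band_powers(signal, sample_rate, bands):
    """Integrate the pulse's power spectrum over frequency bands,
    yielding one feature per band for a classifier."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return np.array([power[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])

# Example: a 50 Hz test tone concentrates its power in the low band.
fs = 1000.0
t = np.arange(1000) / fs
tone = np.sin(2 * np.pi * 50.0 * t)
features = band_powers(tone, fs, bands=[(0, 100), (100, 500)])
```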

        Speaker: Megan McArthur (SNOLAB)
    • Session 5

      10+3 talks
      5:00pm ET Start

      Conveners: Matthew Stukel (Queen's University), Leon Pickard
      • 21
        Analysis of Alpha Particle Quenching in SNO+ Scintillator Using Internal Polonium Backgrounds

        The SNO+ experiment is a multipurpose neutrino detector located 2 km underground at SNOLAB in Sudbury, Ontario. The goal of the experiment is to search for neutrinoless double beta decay ($0\nu\beta\beta$) in liquid scintillator loaded with $^{130}\text{Te}$ in a low-background environment, thus necessitating the study of radioactive background contaminants. A sound understanding of the backgrounds can greatly improve calibrations and characterization of the light yield in the detector by comparing the light response to the results produced in simulation. One of the aspects of characterizing the light yield of the detector is the study of quenching sources, which refers to any process that reduces the quantum light yield of the scintillator. In this talk, analysis of $\alpha$-particle quenching in scintillator from the decays of $^{210}\text{Po}$, $^{212}\text{Po}$ and $^{214}\text{Po}$ is presented. Selections are made to data and Monte Carlo (MC) simulation based on detector geometry, scintillator levels in the partial fill phase and timing characteristics of the decays. The emission spectra in both data and simulation can be modelled effectively with exponentially modified Gaussian distributions due to the mono-energetic nature of the decays. Taking the mean value of the models to be the emitted energy of the decays, these parameters can be compared to their respective Q-values. The goal of this analysis is to fit an integrated light yield curve through the points to obtain the Birks' parameters $S$ and $k_{B}$.
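The peak model can be sketched with SciPy's exponentially modified Gaussian (scipy.stats.exponnorm); the spectrum below is synthetic and all parameter values are illustrative, not fitted SNO+ data:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import exponnorm

# Exponentially modified Gaussian: a Gaussian peak with an exponential
# tail, here modelling a mono-energetic alpha line (energies in MeV).
def emg(x, amp, k, mu, sigma):
    return amp * exponnorm.pdf(x, k, loc=mu, scale=sigma)

# Synthetic noiseless spectrum with known parameters.
x = np.linspace(0.0, 2.0, 400)
y = emg(x, amp=100.0, k=1.5, mu=0.8, sigma=0.05)

# Fit, then take the distribution mean as the reconstructed peak energy.
popt, _ = curve_fit(emg, x, y, p0=[80.0, 1.0, 0.7, 0.1])
peak_mean = exponnorm.mean(popt[1], loc=popt[2], scale=popt[3])
```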

        Speaker: Anthony Allega (Queen's University)
      • 22
        Simulating Gamma Ray Interactions to Determine Dead Layer of Germanium

        In this presentation, I will talk about my work simulating gamma ray interactions in a germanium detector. My aim was to determine the dead layer of the germanium detector. In the germanium crystal, there is a layer of lithium ions that extends some distance into the crystal. Any interactions in this layer are inactive and are thus not useful for detection of gamma rays. Calculating the dead layer will provide a more accurate picture of what interactions occur in active germanium. To do this, I needed to create a model of the detector. I used a program called Geant4, which is used in particle physics applications. I used a simpler application of G4 called g4simple that uses geometry description markup language (GDML). The model was created to be as close to the actual setup of the germanium detector as possible. Using this model, simulations were run with energies corresponding to the gamma decay of Ba133. The ratio of the events in the peaks was recorded for different values of dead layer (distance from edge of germanium). Data from the detector could then be collected and analyzed to determine the dead layer of the detector based on the ratio of events in peaks that corresponded to a specific dead layer value in the simulation.

        Speaker: Erin Bolger
      • 23
        Recent development of the radiopurity.org materials database

        The radiopurity.org database has proven to be a valuable resource for the low background physics community as a tool to track and share assay results. This talk will describe recent collaborative efforts between SNOLAB and the Pacific Northwest National Laboratory to modernize the database for the community. The new version includes all features of the original interface, along with additional features such as facilitated inserting and updating of data, automatic conversion of units of displayed results, and restructured user authentication. Discussed topics will include the features that I have been responsible for implementing for the site, as well as other completed projects such as a version of the radiopurity.org site exclusively for screening results taken with the XIA UltraLo1800 alpha counter.

        Speaker: Vicente Garrido (SNOLAB)
      • 24
        Radon Measurements and the Lucas Cell System

        Radon-222 is a naturally occurring radioactive gas originating from the Uranium-238 decay chain. Its presence is problematic in highly sensitive, low-background experiments such as dark matter or neutrino-less double beta decay searches, as it can mask events of interest. To evaluate Radon-222 levels in the SNO+ experiment, volumes from the cover gas system are passed through the radon board, where Radon atoms are subsequently trapped and transferred to a Lucas Cell counter. Putting the Lucas Cell on a photomultiplier tube and applying a voltage allows for the counting of alpha pulses and subsequently, the determination of the number of radon atoms. To improve the accuracy of results, several components of this system require testing. In this talk, I will present the efficiency measurements of the radon board, the Lucas Cell background levels, and various updates made to the entire data collection process.

        Speaker: Julia Azzi
      • 25
        Characterization of ion plumes in laser ablation ion sources

        Neutrinoless double beta decay (0νββ) is an interaction forbidden by the Standard Model because of its lepton-number-violating properties. If 0νββ were observed, it would demonstrate lepton number violation and imply that the neutrino is its own anti-particle. Several groups are searching for this rare decay, as it would imply new and exciting physics beyond the Standard Model of particle physics. One of them is the nEXO collaboration, which is developing a multi-ton-scale time-projection chamber. The nEXO detector is anticipated to be sensitive to half-lives on the order of $1\times10^{28}$ years.

        A unique advantage of using a Xe time-projection chamber (TPC) detector is the possibility to locate the decay within the detector volume, and to extract into vacuum and identify Ba-136, the ββ-decay daughter of Xe-136. This so-called tagging allows one to dramatically reduce the background of the measurement to virtually zero, except from contributions of the allowed 2νββ. Ba-tagging is being developed as a potential upgrade of the nEXO detector.

        The focus of the development is the extraction of individual Ba ions from xenon, i.e. the extraction of one ion from moles of xenon. This challenging task is being tackled through the development of various techniques within the nEXO collaboration. An in-xenon ion source will be required to characterize the extraction process and efficiency.

        To this end, I am working remotely with the Ba-tagging group at McGill University to develop an In-Gas Laser Ablation Source (IGLAS). The goal of this source is to provide ions for tests of the Ba-tagging extraction apparatus for the nEXO experiment upgrade. The source would provide Ba ions for extraction and identification tests. I am working with the group to acquire images of the ion plume using a high-speed camera, from vacuum pressure up to high-pressure xenon gas. Images are acquired with a Python program using the Vimba camera interface, with code developed to trigger the camera and start measurements when ablation occurs. The software ImageJ is used to analyze these images and extract dynamical parameters of the ion plume, including plume length, plume angle, and, most importantly, expansion velocity. These algorithms will be used to better understand the characteristics of the ion plume and to compute its expansion velocity, which is important for the extraction of Ba from the plume. I will present the current status of this work, including the algorithms and how the dynamical parameters are extracted.
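        As an illustration of the velocity-extraction step, the plume-front positions measured from successive camera frames can be fit with a straight line whose slope is the expansion velocity. A minimal sketch (the frame rate and units here are placeholders, not the actual experimental values):

```python
import numpy as np

def expansion_velocity(front_positions_mm, frame_rate_hz):
    """Fit a straight line to plume-front position versus time; the slope
    is the plume expansion velocity.  In the real analysis the positions
    would come from thresholded high-speed camera frames (e.g. via ImageJ)."""
    t = np.arange(len(front_positions_mm)) / frame_rate_hz  # frame times, s
    slope, _intercept = np.polyfit(t, front_positions_mm, 1)
    return slope  # mm per second

# synthetic example: a front moving at 2000 mm/s imaged at 10 kHz
positions = 2000.0 * np.arange(5) / 10000.0
v = expansion_velocity(positions, 10000.0)
```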

        Speaker: Joseph Torsiello
      • 26
        The Evolution of Fundamental Galaxy Scaling Relations

        The purpose of this project is to investigate the structure of spiral galaxies using numerical simulations. We specifically explored the evolution of fundamental galaxy scaling relations, such as the Tully-Fisher relation (TFR), over the lifetime of a galaxy. The TFR is a linear relationship between the baryonic or stellar mass of spiral galaxies and their circular velocity. This project utilized two specific TFRs: the baryonic TFR (BTFR), which relates the baryonic mass of a galaxy to its circular velocity, and the stellar TFR (STFR), which compares the stellar mass of a galaxy to its circular velocity. While the TFR can provide valuable structural information about a galaxy (e.g., its distance), its underpinnings remain elusive and hold deep secrets about the way that baryons and dark matter influence each other. To examine how the TFR evolves with time, a hydrodynamical simulation of galaxy formation and cosmic structure was required; the Numerical Investigation of a Hundred Astrophysical Objects (NIHAO) simulation was used to this effect. The exact spatially-resolved definition of structural parameters also affects the final TFR and how it evolves with time. With NIHAO, we can therefore better define TFR scaling parameters for fully evolved systems (at the current time) and track their evolution with time. Four categories of locations within a galaxy were tested for the measurement of TFR velocities: 1) physical radii, 2) arbitrary (e.g. maximum) velocities, 3) percentages of the virial radius, and 4) isodensity levels. For each of these categories, four specific locations within the galaxy were tested, resulting in a total of sixteen test locations at which to measure TFRs. After investigating the TFRs produced at these test locations for fully evolved ("redshift zero") galaxies, three locations were selected to track the TFRs: an isodensity level and two different velocities.
        The isodensity level chosen was $1$ $M_{\odot}\,pc^{-2}$, and the velocities chosen were the velocity of the final data point for each galaxy and the median velocity between the effective radius and the last data point of each galaxy. Data analysis tracking the evolution of the slopes and scatters of the TFRs at each of these three finalized locations is being applied over the full redshift range (0-3) of the NIHAO simulations. A project of this nature, aimed at understanding the detailed evolution of the TFR parameters and how it may be connected to the growth of a galaxy's stellar mass, dark matter mass, or both, has never been presented before.
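        As a sketch of how a TFR velocity can be read off at the chosen isodensity level, one can interpolate the radius where the surface-density profile falls to 1 solar mass per square parsec and evaluate the rotation velocity there (illustrative code; it assumes a monotonically declining profile, and the example profile below is synthetic):

```python
import numpy as np

def velocity_at_isodensity(r_kpc, sigma_msun_pc2, v_kms, target=1.0):
    """Interpolate the radius where the surface-density profile crosses
    `target` (1 Msun/pc^2 by default) and return the rotation velocity
    there.  Assumes sigma decreases monotonically with radius."""
    # np.interp needs increasing x, so feed it the reversed declining profile
    r_iso = np.interp(np.log10(target),
                      np.log10(sigma_msun_pc2[::-1]), r_kpc[::-1])
    return np.interp(r_iso, r_kpc, v_kms)
```

        For an exponential disk with a central density of 100 Msun/pc^2 and a 3 kpc scale length, the isodensity radius sits near 13.8 kpc, and the velocity is interpolated from the rotation curve at that radius.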

        Speaker: Katherine Frazer
  • Tuesday, 24 August
    • Session 6

      10+3 talks
      11:00pm ET Tuesday

      Conveners: Matthew Stukel (Queen's University), Leon Pickard
      • 27
        Using Machine/Deep Learning to Analyze Acoustic Data and Probe the Physical Development of Bubble Nucleation

        The PICO detector generates acoustic signals from the bubble nucleation caused by incident particles, such as alphas, neutrons, gammas, and potentially WIMPs. These signals contain properties that can be used to distinguish among different particles. The acoustic parameter (AP) is one such tool: it measures the magnitude of the acoustic signals at different frequencies and can distinguish between nucleation sources with up to 99% precision.

        The purpose of our experiment is to better understand the impact a nucleation source (alphas, neutrons, gammas, etc.) might have on the physical growth of a bubble. To accomplish this, we first use neural networks and decision trees to identify key features of the acoustic signals formed by bubble nucleation. We then use these features to distinguish between bubbles produced by different particle sources. Finally, after categorizing these signal signatures, we expect to be able to model the acoustic signal produced by different nucleating particles.

        We will also be examining how bubbles produced in a superheated droplet detector (PICASSO) might grow in a different manner to bubbles produced in bubble chambers filled with superheated nucleation fluids (PICO).

        We use a neural network to verify the fitting of the classification tree model. A neural network is a collection of computer "neurons" which mimic the behaviour of the human brain. As these neurons are trained on certain data, they activate in specific orders that are recognized by the model to help categorize the incoming data. In our experiment, we train the network to distinguish between alpha- and neutron-induced bubble events using the whole acoustic signals. We expect to use this trained algorithm to confirm the accuracy of the classification tree model produced by Megan McArthur (also presenting a talk at CASST 2021). This classification tree model would then be used to highlight the bubble growth patterns of the triggering particles, allowing us to explore the processes of bubble growth.
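        The discrimination idea can be illustrated with a toy acoustic parameter: integrate the spectral power of the recorded trace in a frequency band and cut on it. The band edges, the cut value, and the "alphas are louder" convention below are assumptions for illustration, not PICO's actual calibration:

```python
import numpy as np

def acoustic_parameter(signal, sample_rate_hz, band=(1e3, 20e3)):
    """Toy acoustic parameter: total spectral power of the acoustic trace
    inside a frequency band (band edges are illustrative)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    power = np.abs(np.fft.rfft(signal)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].sum()

def classify_event(signal, sample_rate_hz, ap_cut):
    """Decision-stump classifier: events with AP above the cut are tagged
    as alpha-like (alphas tend to produce louder acoustic emission)."""
    ap = acoustic_parameter(signal, sample_rate_hz)
    return "alpha" if ap > ap_cut else "nuclear recoil"
```

        A real classification tree would cut on many such features at once; this single-feature stump just shows the mechanics.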

        Speaker: Mr Kyle Yeates (SNOLAB/Laurentian University)
      • 28
        Geometries and PMT Configurations for nEXO Outer Detector Simulations

        The nEXO experiment pursues the goal of observing neutrinoless double beta decay by placing 5000 kg of liquid xenon enriched in 136Xe within a cryostat and limiting backgrounds as much as possible. To limit cosmogenic backgrounds, the cryostat is immersed in a large water tank several kilometres underground. Despite these efforts, cosmic muons can still cause an appreciable background at this depth, so it is important to identify which muons pass close enough to cause detection issues. Some of this can be achieved by tagging cosmic muons passing through the water tank using the ~125 photomultiplier tubes (PMTs) to register the Cherenkov radiation produced by the muons. An important step in building the outer detector (OD) is using simulations to optimize the geometry of these PMTs. This process includes using Fusion 360 to design both realistic and simplified geometry files of various PMT arrangements, which are then used in Chroma ray-tracing simulations to test their efficiency. In this presentation, we will show these geometries, along with some of the processes and results of the simulations run using them.

        Speaker: Liam Retty (Laurentian University)
      • 29
        Measuring the Mass of the Milky Way: Converting a 350 PHYS Project into a 250 Lab

        Every year the PHYS 350 students design and run an experiment themselves. It is usually disassembled soon after, but this year we wanted to give the second-year students a look at what they could be doing in their next year, as well as a chance to do an astronomy-based experiment in PHYS 250. We chose an experiment on measuring the mass of the Milky Way, based on a published paper on the topic. This lab allows the second-year students to learn basic astronomy practices as well as data analysis and critical reading in scientific settings.

        Currently, in PHYS 250 all students are given a list of 3-hour experiments to choose from. This new Milky Way experiment will be an option for two groups in tandem. We split the lab into two weeks so there is more time for planning and completing the experiment.

        The main objective of this lab is to measure radio frequencies coming from hydrogen clouds in the arms of the Milky Way and, through rotation calculations and the Doppler effect, determine the size and mass of the Milky Way. Most of the equipment needed for this experiment was built by the PHYS 350 students in 2020-21, but the code for the GNU Radio receiver was modified to create a binary file output of any signal data gathered from the antenna. This new code made the error analysis much quicker and simpler. Data can now be analyzed in any Python environment by converting the binary file to a NumPy array and then performing the needed calculations.
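        The binary-to-NumPy step and the Doppler conversion can be sketched as follows. GNU Radio file sinks write raw interleaved 32-bit float I/Q samples, i.e. NumPy `complex64`; the exact flowgraph output format used in the lab is an assumption here:

```python
import numpy as np

H_LINE_HZ = 1420.405751768e6   # rest frequency of the 21 cm hydrogen line
C_KMS = 299792.458             # speed of light, km/s

def load_iq(path):
    """Read a raw GNU Radio capture as complex64 I/Q samples."""
    return np.fromfile(path, dtype=np.complex64)

def doppler_velocity_kms(observed_hz, rest_hz=H_LINE_HZ):
    """Non-relativistic Doppler shift: positive velocity means the
    hydrogen cloud is receding (observed frequency below rest)."""
    return C_KMS * (rest_hz - observed_hz) / rest_hz
```

        For example, a hydrogen line observed 100 kHz below its rest frequency corresponds to a cloud receding at about 21 km/s.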

        We conducted a mock lab by running the experiment with different members of our group with varying physics and coding backgrounds. We were able to create a question plan that leads students in the right direction for their observations while also making sure they employ critical thinking and research skills. In the first week, the students will run through various questions about astronomy and how the environment will affect their data, as well as plan their observations for the following week with the hardware they want to use. In the second week they will perform their signal measurements and begin their data analysis.

        Speaker: Madelynn Mast (Queen's University)
      • 30
        Logical Muon Generation for Monte Carlo Simulations (Underground)

        The main purpose of building a particle detector underground is to minimize cosmic-ray-induced backgrounds; however, even the low flux of residual cosmic rays can constitute an appreciable source. The high sensitivity required to detect rare nuclear decays or other sought-after interactions can exaggerate these effects. Through-going cosmogenic muons contribute to the background for many underground detection experiments, including nEXO. There are several muonic interactions from which erroneous signals arise, and muons may also cause delayed interactions within the detector through their interactions in the surrounding rock. One way to mitigate this problem is to build a secondary (auxiliary) detector that tags muons so that the associated signals can be discarded in the main detector.

        One such detector is being designed to work with the nEXO time projection chamber (TPC), tagging muons that pass through the water tank in which the TPC is installed. This introduces new work in optimization through simulations and design alterations of an entirely distinct system. This presentation will briefly cover the mechanisms by which muons affect nEXO detector operations and the rudimentary functioning of the water-Cherenkov detector. On this basis, an approach to logically generate such muons will be shown. Many of the relations to be presented have previously been used in Monte Carlo simulations for underground experiments, some of them at SNOLAB. While the underground energy spectrum and zenith-angle intensity are normally addressed in muon veto simulations, there is a lack of clarity regarding the zenith-angle dependence of the muon energy. This will be featured with an example of how to implement it in code. While the focus will be on SNOLAB, particularly with its flat overburden, the topics discussed are applicable to other underground laboratories.
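        As an illustration of logical muon generation, zenith angles under a flat overburden can be drawn by rejection sampling from a simplified single-exponential intensity, I(theta) proportional to sec(theta) * exp(-h sec(theta) / lambda). This is a stand-in for the standard two-exponential parameterization, and the depth and attenuation-length values below are placeholders, not SNOLAB's measured ones:

```python
import numpy as np

def sample_zenith(n, depth_kmwe=6.0, att_len_kmwe=0.81,
                  theta_max=np.deg2rad(85.0), rng=None):
    """Rejection-sample muon zenith angles for a flat overburden using the
    simplified intensity I(theta) = sec(theta)*exp(-h*sec(theta)/lambda).
    The sin(theta) factor converts intensity per solid angle into a
    distribution in zenith angle.  Parameter values are illustrative."""
    rng = rng if rng is not None else np.random.default_rng(0)

    def intensity(theta):
        sec = 1.0 / np.cos(theta)
        return sec * np.exp(-depth_kmwe * sec / att_len_kmwe)

    # envelope: for deep sites the intensity peaks vertically, and sin <= 1
    bound = intensity(0.0)
    out = np.empty(0)
    while out.size < n:
        theta = rng.uniform(0.0, theta_max, size=4 * n)
        accept = rng.uniform(0.0, bound, size=4 * n) < intensity(theta) * np.sin(theta)
        out = np.concatenate([out, theta[accept]])
    return out[:n]
```

        A full generator would also draw a muon energy for each sampled zenith angle, which is exactly the coupling the abstract highlights.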

        Speaker: Regan Ross
      • 13:45
        Break
    • Session 7

      10+3 talks
      12:30pm ET Tuesday

      Conveners: Matthew Stukel (Queen's University), Leon Pickard
      • 31
        Identification of a Directional Cherenkov Signal in Tl208 Decays in the SNO+ Detector

        The identification of a directional signal from the Cherenkov photons generated during Tl208 decays in the SNO+ detector will be discussed in this presentation. The SNO+ experiment aims to study neutrino properties using a target of 780 tonnes of organic liquid scintillator. Liquid scintillator produces light when excited by the passage of charged particles. The isotropic scintillation light, which dominates the signal, can be used to determine a number of properties of the particles, but cannot be used to determine the direction of the primary particle. Cherenkov light, which is present as a sub-dominant component of the signal, is directional. This analysis therefore investigates the possibility of separating the Cherenkov signal from the scintillation light. The analysis uses Tl208 decays, which are a background in the SNO+ detector. The angle between the high-energy gamma released during Tl208 decays and the photons generated in the event was analyzed, and the angular distributions of the data were compared with Monte Carlo simulations with Cherenkov effects included and excluded.

        Speaker: Maeve Cockshutt (SNO+)
      • 32
        Improving Background Simulation Efficiency with Flux Surfaces

        The Cryogenic Underground Test Facility (CUTE) is operational at SNOLAB. It is currently testing cryogenic silicon and germanium detectors which will be deployed in the SuperCDMS SNOLAB experiment. Given the small interaction rate of possible dark matter particles with ordinary matter, the radioactive backgrounds due to the lab cavern and other materials surrounding and part of the CUTE facility must be carefully characterized. One of the main tools for background studies is extensive simulation work using the Geant4 toolkit.

        Over the summer term, I focused on implementing a strategy to speed up the background simulations. The innovation of the project relies on the creation and propagation of decay products from the environment and shielding through the CUTE facility, down to a closed surface encompassing the detectors. At this surface, a flux counter records the flux of incoming particles per source component and contaminant. In the next simulation stage, the particle fluxes are propagated onto the detector stack. While this latter simulation stage will be repeated whenever the detector payload is changed, the previous simulations will be done once and reused many times. This presentation outlines an initial version of the simulation pipeline, as well as some preliminary results that can be used to estimate the biases introduced by this approach.

        Speaker: Alexander Pleava (McMaster University)
      • 33
        NEWS-G Sensor Characterization

        NEWS-G’s spherical proportional counters feature an 11-anode sensor at their centre. The sensor has two channels through which voltage can be applied, with the north channel serving the 5 northernmost anodes, and the south channel serving the southernmost 6 anodes. The electric field produced in the detector is asymmetrical (due to the rod that connects the sensor to the outer part of the vessel) and thus requires characterization to correctly interpret dark matter search data.

        My research this summer has been focused on sensor characterization. I have used different energy sources (Fe-55, Ar-37) and have varied the gas mixture in the detector as well to study the relationship between the voltage applied to the sensor and the resulting signal amplitude across different experimental setups. The goal of this project is to determine a voltage that will reliably yield equal signal amplitude in both channels of the sensor.

        Speaker: Irina Babayan (Queen's University)
      • 34
        In-gas laser ablation source (IGLAS) for nEXO's Ba-tagging group at McGill

        The nEXO collaboration is searching for neutrinoless double beta decay (0νββ) events in a multi-ton time projection chamber (TPC) filled with liquid xenon enriched in Xe-136, which would decay into Ba-136 ions. As an upgrade, the collaboration is also developing a barium tagging technique to extract Ba-136 from the liquid xenon and identify it, eliminating all other background events. To further understand the production and extraction of barium ions in detector-like conditions, progress is being made towards the development of an in-gas laser ablation source (IGLAS). This setup focuses and guides a laser beam onto the surface of a barium sample inside a high-pressure noble gas chamber. The laser ablates ions off the sample surface, which are measured in the form of an ion current. The goal of the summer 2021 IGLAS project is to study ion current signals as a function of different parameters including, but not limited to, buffer gas, pressure, voltage applied to the barium target and ion collector, and laser properties such as power and repetition rate. This presentation covers the current status and development of the ablation source at McGill's Ba-tagging group.

        Speaker: Laura Gonzalez Escudero (McGill University)
      • 35
        Building a Public Database for Transient High-Energy Event Analyses

        With an increasing number of telescopes and detectors, we now possess a wealth of information about various transient events. One such event of note was the neutrino alert from the IceCube observatory that led to the multi-messenger observation of the blazar TXS 0506+056. While the data and methods used to analyze these transient events are publicly available, the analysis can be time-consuming. By building a pipeline which automatically performs this analysis and sends the results to a database, the results will be more easily accessible to both researchers and the public. Through learning the Fermi-LAT analysis process, we have prepared to create this pipeline to populate a database. I will present the current status of the project.

        Speaker: Tai Withers (Queen's University)
    • Session 8

      10+3 talks
      3:00pm ET Start

      Conveners: Amanda Bacon (University of Pennsylvania), Leon Pickard
      • 36
        Simulations of PMT and Track Reconstruction in Magnetic Field

        This summer, I am working on the EMPHATIC (Experiment to Measure the Production of Hadrons At a Testbeam) project under Dr. Blair Jamieson's supervision. EMPHATIC is a proposed experiment to measure hadron scattering and production cross sections for improved neutrino flux predictions; its aim is to provide more complete data to reduce neutrino flux uncertainties. In this project, my responsibilities are working with the PMT, simulating the complete circuit consisting of the PMT and the Padiwa board using LTspice, and performing track reconstruction in the magnetic field. We wanted to check whether the output signal from the full circuit would be big enough for the discriminator, which is an LVDS buffer on an FPGA. We simulated a circuit consisting of the PMT with the components on the Padiwa board in LTspice and tested its output. The Padiwa electronics board will play a big role in this experiment, as it processes and amplifies the electric pulse from the PMT so that the pulse is big enough for the discriminator. We compared the data recorded in the laboratory with the simulated output signal. The measured output voltage was found to be lower than the simulated one, but the experimental waveform matched the simulated one, and the signal was found to be big enough for the discriminator. For the coming EMPHATIC experiment, we are using a permanent magnet to better identify the particle trajectories. It is necessary to ensure that the track reconstruction algorithm is in closest agreement with the real setup of the detector plates. In the 2018 EMPHATIC experiment the permanent magnet was not included, so the particle trajectories were straight lines. The Monte Carlo studies therefore need to consider the absolute alignment of the detector plates, with feedback from the data, in order to achieve higher accuracy in the momentum reconstruction of the particles. Efforts are ongoing to understand and modify the reconstruction algorithm accordingly.

        Speaker: Kishankumar Patel (The University of Winnipeg)
      • 37
        Black hole quasinormal mode spectroscopy

        The study of the evolution of small perturbations in black hole spacetimes, which is surprisingly related to scattering problems in quantum mechanics, has played a fundamental role in gravitational-wave astronomy. Characteristic modes of vibration are ubiquitous in nature, and it is expected that such characteristic frequencies are also associated with black holes. Similar to how the normal frequencies of various musical instruments carry information about the nature of the instrument, the characteristic frequencies associated with black holes carry information about the nature of black holes, i.e., their charge, mass, and angular momentum. Due to the presence of an event horizon, these modes have complex frequencies, and the perturbed black hole system, just like any real-world physical system, is dissipative. To understand this phenomenon, I will discuss the equation governing the behavior of a massless scalar field in a Schwarzschild background and introduce the idea of quasinormal modes (QNMs). Using the Wentzel-Kramers-Brillouin (WKB) method, I will provide a detailed analysis of the QNM frequencies and discuss the intuitive relationship between the characteristic modes of black holes and null circular orbits. I will show that for all spherically symmetric spacetime backgrounds, in the eikonal limit, the damping time associated with QNMs can be obtained indirectly through the instability time scale of a null circular orbit, given by the principal Lyapunov exponent. Further, by computing QNMs with Leaver's method, I will relate them to exciting developments in the detection of gravitational waves and the stability of black holes.
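        The eikonal-limit relation between QNMs and null circular orbits mentioned in the abstract can be stated compactly. For a Schwarzschild black hole of mass $M$ (in units $G=c=1$), the light-ring orbital frequency and the principal Lyapunov exponent coincide:

```latex
% Eikonal (large-l) approximation for QNM frequencies: the real part is set
% by the null circular orbit frequency Omega_c, the imaginary part by its
% instability time scale, i.e. the principal Lyapunov exponent lambda_L.
\omega_{\ell n} \;\approx\; \ell\,\Omega_c \;-\; i\left(n+\tfrac{1}{2}\right)\lvert\lambda_L\rvert,
\qquad
\Omega_c \;=\; \lambda_L \;=\; \frac{1}{3\sqrt{3}\,M} \quad \text{(Schwarzschild)}
```

        Here $\ell$ is the angular multipole number and $n$ the overtone number; the damping time of the mode is $1/\lvert\mathrm{Im}\,\omega_{\ell n}\rvert$, which is exactly the instability time scale of the light ring.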

        Speaker: Ashley Chraya (IISER Mohali)
      • 38
        In-Gas Laser Ablation Source (IGLAS) Development at McGill University

        The Ba-tagging technique will be an upgrade to the nEXO detector that will allow it to extract and identify Ba ions from double beta decay events to help eliminate all background events. The Ba-tagging group at McGill University has previously worked on the Laser Ablation Source (LAS) for ion extraction in vacuum. Currently, the group is making progress towards developing the In-Gas Laser Ablation Source (IGLAS) for further studies of the production and extraction of ions in a controlled gaseous environment. The experiment will also be conducted with various metal targets and gases at high pressures, with the goal of eventually ablating a Ba target in high-pressure Xe gas.

        Speaker: Minya Bai
      • 39
        Distinguishing signal from background for a direct measurement of antihydrogen’s Lamb shift

        It is predicted that equal amounts of matter and antimatter should have been created after the Big Bang; however, the universe is dominated by matter, with much less of its counterpart. Antihydrogen is created and analyzed by the ALPHA (Antihydrogen Laser Physics Apparatus) collaboration at CERN to look for asymmetries by comparing its spectra with hydrogen's. The Lamb shift is an important transition in hydrogen, traditionally defined as the splitting of the 2S1/2 and 2P1/2 states at zero magnetic field. To date it has only been indirectly measured in antihydrogen, using data from two separate laser spectroscopy experiments. ALPHA's goal is to make a more precise direct measurement of this transition, where contrasting measurements of the Lamb shift in hydrogen and antihydrogen will provide insight into the differences between the two. To measure this, a trapped antihydrogen atom must be excited from 1S (the ground state) to 2S with a laser. Next, microwave radiation is applied to cause a transition to 2P. The 2P state has a high probability of undergoing a positron spin-flip transition to an untrapped state, resulting in the anti-atom annihilating on the surrounding apparatus walls. Two types of annihilations occur: those from a 2S-2P transition followed by a spin-flip decay, the desired signal, and those from the ionization of the 2S state by the excitation laser, an undesired background. It is critical to distinguish between the two to complete a Lamb shift measurement. This is done by examining existing laser spectroscopy datasets categorized into those with and without ionizations. Those without ionizations are from experiments where the excitation was directly from 1S to 2P, followed by a spin-flip decay and a detected annihilation. Contrasting these datasets allows variations in the ionization and spin-flip annihilation position distributions to be determined.
        A statistical analysis of the position distribution variations gives the information necessary to distinguish the desired signal from the undesired background. It also provides insight into how various experimental modifications impact the location of annihilations. Knowing this is critical in determining the viability of a direct Lamb shift measurement and the appropriate laser and microwave parameters. From here, ALPHA can design an experiment to make the first direct measurement of this important transition in antihydrogen, which could lead to a discovery related to the matter/antimatter asymmetry problem.
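        One standard way to quantify such position-distribution differences is a two-sample Kolmogorov-Smirnov statistic on, for example, the axial annihilation coordinate. A self-contained sketch (the choice of test and coordinate is illustrative, not necessarily ALPHA's actual analysis):

```python
import numpy as np

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of two samples (e.g. annihilation
    z-positions from ionization vs. spin-flip datasets)."""
    a = np.sort(np.asarray(sample_a, dtype=float))
    b = np.sort(np.asarray(sample_b, dtype=float))
    grid = np.concatenate([a, b])                    # evaluate at all data points
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return np.abs(cdf_a - cdf_b).max()
```

        A statistic near 0 means the two position distributions are indistinguishable; a statistic near 1 means they barely overlap, making signal/background separation straightforward.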

        Speaker: Abbygale Grace Swadling (University of Calgary Dep. of Phys. and Astronomy (CA))
    • Session 9

      10+3 talks
      3:30pm ET Start

      Conveners: Amanda Bacon (University of Pennsylvania), Leon Pickard
      • 40
        Tellurium Plants and Tellurium Purification in SNO+

        SNO+ is a continuation of the original SNO experiment conducted in the late 1990s and early 2000s. SNO+ uses Linear Alkyl Benzene (LAB) as a liquid scintillator to detect neutrinos. Tellurium will be added to the scintillator to help search for neutrinoless double beta decay. In this presentation, I will focus on the underground purification and synthesis process of the tellurium complex.

        Speaker: Julia Patterson (SNOLAB)
      • 41
        A comparative study of various Deep Learning techniques for spatio-temporal Super-Resolution reconstruction of forced isotropic turbulent flows

        Super-resolution is an innovative technique that upscales the resolution of an image or a video, enabling us to reconstruct high-fidelity images from low-resolution data. We deploy super-resolution analysis on turbulent flow fields, spatially and temporally, using state-of-the-art machine learning techniques such as ESPCN, ESRGAN and TecoGAN to reconstruct high-resolution flow fields from low-resolution flow field data, keeping in mind the need for low resource consumption and rapid production and verification of results. The dataset used for this study is extracted from the 'isotropic 1024 coarse' dataset, part of the Johns Hopkins Turbulence Databases (JHTDB). We have utilized pre-trained models and fine-tuned them to our needs, so as to minimize the computational resources and time required for the implementation of the super-resolution models. The advantages presented by this method far exceed the outcomes of regular single-structure models. The results obtained through these models are then compared using the MSE, PSNR, SAM, VIF and SCC metrics in order to evaluate the upscaled results, find the balance between computational power and output quality, and identify the most accurate and efficient model for spatial and temporal super-resolution of turbulent flow fields.
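        As an example of the evaluation stage, PSNR is derived directly from the MSE between the reference and reconstructed fields. A minimal sketch (the normalization to a unit data range is an assumption):

```python
import numpy as np

def psnr(reference, reconstructed, data_range=1.0):
    """Peak signal-to-noise ratio in dB between a high-resolution reference
    flow field and its super-resolved reconstruction.  `data_range` is the
    assumed dynamic range of the (normalized) field; higher PSNR is better."""
    ref = np.asarray(reference, dtype=float)
    rec = np.asarray(reconstructed, dtype=float)
    mse = np.mean((ref - rec) ** 2)
    if mse == 0.0:
        return np.inf  # identical fields
    return 10.0 * np.log10(data_range ** 2 / mse)
```

        For instance, a uniform error of 0.1 on a unit-range field gives 20 dB.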

        Speaker: T.S.Sachin Venkatesh (Delhi Technological University)
      • 42
        Simulation of backgrounds for the Ar2D2 detector

        Liquid argon is used in dark matter detectors; its biggest issue is the argon-39 present within the liquid argon. The Ar2D2 detector will be responsible for detecting this argon-39, and to be as accurate as possible, the detector's own backgrounds must also be understood.

        Speaker: Maria Ortiz
      • 43
        Cascade Reconstruction in Pacific Ocean Neutrino Experiment (P-ONE)

        The Pacific Ocean Neutrino Experiment (P-ONE) is a proposed multi-cubic kilometer neutrino telescope off the coast of Canada. It aims to probe physics at previously inaccessible regimes, and builds on the discoveries made by the IceCube Experiment, located at the South Pole. Some goals of a full-scale P-ONE are to confirm the relative flux for different neutrino flavors and their possible cosmic origins.

        I have explored the problem of analysing a single neutrino that enters the P-ONE detector and classifying it as either a muon-type neutrino or an electron-type/tau-type neutrino. If this is accomplished accurately, we will be able to solve the problem of determining neutrino fluxes.

        Neutrino flavors can be distinguished by how they interact with the medium inside the detector while producing its corresponding charged lepton. A muon neutrino gives a track-like signature while electron/tau neutrinos produce secondary hadronic showers called “cascades”. To combat an inherent risk of misclassification of track-like events as cascade-like events due to large stochastic energy losses, I proposed, designed and benchmarked a cascade-fitter algorithm that uses the maximum likelihood reconstruction approach.

        I used a three-step process for the cascade fitter. First, the neutrino-nucleon interaction vertex is approximated by taking a charge-weighted average over the positions of the hit detector modules. Second, I improved on the assumption that the Cherenkov emissions are isotropic by incorporating a carefully chosen anisotropy factor that depends on the relativistic velocity of the primary daughter particle. Finally, the parameters of the anisotropy function were obtained by fitting to data representing the angular distribution of DOM hits with respect to the primary lepton direction.
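        The first step described in the abstract, seeding the vertex from the collected charges, can be sketched as a charge-weighted centroid over the hit optical modules (illustrative code; the module positions and charges are placeholders):

```python
import numpy as np

def seed_vertex(module_positions, module_charges):
    """Seed the cascade vertex with the charge-weighted mean of the hit
    optical-module positions; brighter modules pull the seed towards them."""
    pos = np.asarray(module_positions, dtype=float)  # shape (n_hits, 3)
    q = np.asarray(module_charges, dtype=float)      # collected charge per hit
    return (pos * q[:, None]).sum(axis=0) / q.sum()
```

        This seed is then refined by the maximum-likelihood fit with the anisotropic emission model.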

        For this aforementioned reconstruction approach, we find that the peak of the likelihood curve shifts towards the origin, in comparison to the isotropic case. This indicates that the fit is a lot better than the existing approach, and the high likelihood value also provides a strong confirmation of the fit. There is also an improvement in the resolution of the azimuthal and zenith angle estimations of the incoming electron or tau neutrino. We hope to improve the analysis and fit further to improve our understanding of neutrino physics.

        Speaker: Mr Kaustav Dutta (University of Alberta)
      • 44
        Right Time, Right Place: a Comparative Analysis of DEAP-3600’s Position Reconstruction Algorithms

        DEAP-3600 is a highly sensitive spherical direct-detection dark matter experiment. Scintillation light detected following subatomic particle interactions with liquid argon is used to classify events in the search for Weakly Interacting Massive Particles (WIMPs). As part of this classification, position reconstruction algorithms aim to pinpoint where the interaction took place. In this report, the performance of a new artificial intelligence algorithm is compared with DEAP-3600's existing algorithms for position reconstruction.

        Speaker: Raveen Sidhu (DEAP-3600)
    • 18:30
      Deliberation Period
    • 45
      Prize Announcement and Closing Remarks
      Speaker: Christine Kraus (Laurentian University)