Registration for participants is now closed. Registration for viewers will remain open.
Day 2 Zoom: https://laurentian.zoom.us/j/93614635887?pwd=MUR1NjJyaDdzN3d3aUl5T2tJQ0tvdz09
Feel free to register as a viewer.
All summer students who are not currently engaged in graduate studies and do not already hold a graduate degree are encouraged to attend and submit an abstract. A final programme will be posted once registration closes.
Talks will be 9 minutes long, followed by a 2-minute question period. The final schedule is now available.
Two competitors will be granted an all-expenses-paid opportunity to present at the 2023 Canadian Association of Physicists (CAP) Congress in Fredericton, NB, the prestigious annual meeting of all Canadian physicists.
The competition is co-sponsored by SNOLAB and the McDonald Institute.
Executive Learning Centre (ELC)
Zoom Link (TBA)
How the Sun Shines
School teachers across Canada are always looking for new ways to enrich their classrooms and bring exciting science to their students, no matter the grade level. That is why I have created three versions of an astroparticle physics educational kit, with provincial and territorial curriculum connections mapped across the country. The main topic is answering the question “how does the Sun shine?” The three versions of the kit correspond to education levels: elementary, middle, and high school. The content and activities are also scaffolded within those levels to enhance the learning of all students in the classroom. The kits include a teacher’s resource with background information and suggested ways to implement the content, worksheets, group discussion prompts, and hands-on activity guides. Each grade explores the concepts of solar radiation, energy, nuclear fusion, and particle physics, with an option to explore the solar neutrino problem. The high school kit uses real data from the SNO experiment to illustrate the scientific process, from recognizing a problem to discovering its solution. This project brings Ontario Nobel Prize winner Arthur B. McDonald and his work to classrooms across Canada, offering a world-class exploration of astroparticle physics and ultimately inspiring the next generation of Canadian scientists.
NEWS-G’s dark matter detectors use spherical proportional counters (SPCs) filled with light gases to detect low-energy WIMPs through nuclear recoil. The quenching factor of each gas used must be calibrated for WIMP detection. This can be done using low-energy neutrons, which interact via nuclear recoil in a way similar to the predicted WIMP interaction. An experiment using a neutron beam in the 10 keV region at the Reactor Materials Testing Laboratory at Queen’s University will perform this calibration. We have conducted Monte Carlo simulations of how the neutron beam interacts in various materials to determine the output particles. The simulations allow us to determine the shielding required to remove the gamma background and the expected signal-to-noise ratio in the SPC.
Cosmic-ray muons are plentiful at sea level; about one per second passes through an outstretched hand. They are useful as calibration tools for developers of particle detectors, but they are a significant source of background for low-rate experiments and are one of the reasons that deep-underground facilities like SNOLAB exist. The muons are the decay products of charged mesons (mostly pions) produced in the particle cascades created when high-energy cosmic rays (mostly protons and helium nuclei) strike the upper atmosphere. These cascades are known as extensive air showers. One can learn about the showers by measuring the lateral distribution of muons, which can be done by recording the coincidence rate as a function of the separation of two scintillation counters. In this talk, I will describe such a measurement.
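As a rough illustration (not part of the measurement described in this abstract), the quoted "one per second through a hand" follows from the common sea-level rule of thumb of about one muon per square centimetre per minute; the hand area below is an assumed value.

```python
flux = 1.0 / 60.0   # muons per cm^2 per second (~1 per cm^2 per minute at sea level)
hand_area = 60.0    # cm^2; rough area of an outstretched hand (assumption)
rate = flux * hand_area
print(f"~{rate:.1f} muons per second through the hand")
```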
The DEAP experiment, located 2 km underground at SNOLAB (6.011 km w.e.), is designed to search for Weakly Interacting Massive Particles (WIMPs), one of the most promising dark matter candidates. The experiment consists of a target of 3.3 tonnes of liquid argon contained in a spherical acrylic vessel. The target is surrounded by a cylindrical tank filled with ultra-pure water, which allows for the rejection of muons. The water tank also shields against environmental neutrons created by muon spallation in the surrounding rock; these are rejected using the muon signals in the water.
The only measurement of the muon flux at SNOLAB was performed by the SNO experiment, which measured a flux of 3.31e-10 mu/s/cm^2. The DEAP experiment is performing an independent measurement of the muon flux, needed for the background evaluation of the upcoming dark matter searches.
A PyROOT-based Monte Carlo simulation was developed to evaluate the expected rate in the DEAP experiment, starting from the Mei and Hime model, which describes the differential intensity of muons for underground experiments. The preliminary effective area of the DEAP muon veto is found to be A_eff = 63.5 +- 1.9 m², which points to a rate of R_mu = 18.16 +0.59 -0.57 mu/day, assuming the SNO flux.
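As a sanity check, the quoted rate is just the SNO flux multiplied by the preliminary effective area; a minimal sketch using only the numbers given in this abstract:

```python
sno_flux = 3.31e-10            # muons / s / cm^2 (SNO measurement)
a_eff_cm2 = 63.5 * 1e4         # preliminary DEAP veto effective area, m^2 -> cm^2
rate_per_day = sno_flux * a_eff_cm2 * 86400  # 86400 seconds per day
print(f"R_mu ~ {rate_per_day:.2f} muons/day")  # ~18.16, matching the quoted rate
```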
It has been suggested that the ultra-diffuse galaxy AGC 114905 contains little to no dark matter [1], a claim that stems from a 3D kinematic analysis of atomic gas maps of the system. My analysis aims to study the reliability of 3D-BAROLO, the software used to derive this claim, in the low-mass, low-resolution regime, looking for sources of uncertainty in the calculation. By producing mock observations of ultra-diffuse galaxies, the accuracy of the rotation curve, velocity dispersion profile, and surface density profile produced by 3D-BAROLO was quantified. I found that the reliability of the 3D-BAROLO fitting algorithm depends heavily on the inclination value assumed in the models, implying that there could be dark matter in AGC 114905 if its galactic inclination is overestimated in ref. [1]. In addition, my analysis suggests that the velocity dispersion and surface density profile amplitudes can be systematically underestimated by 3D-BAROLO when it is applied to AGC 114905-like observations, which could alter the no-dark-matter interpretation for this system.
Reference:
[1] P. E. M. Piña, F. Fraternali, T. Oosterloo, E. A. K. Adams, K. A. Oman, and L. Leisman, “No need for dark matter: resolved kinematics of the ultra-diffuse galaxy AGC 114905,” Monthly Notices of the Royal Astronomical Society, vol. 512, no. 3, pp. 3230–3242, Dec. 2021. [Online]. Available: https://doi.org/10.1093%2Fmnras%2Fstab3491
The Light-only Liquid Xenon (LoLX) experiment is designed to study light emission, transport, and detection in liquid xenon (LXe) detectors using silicon photomultipliers (SiPMs). LoLX consists of 96 Hamamatsu VUV4 SiPMs arranged in a cylindrical geometry and submerged in liquid xenon. This R&D detector is used to investigate the timing structure of light-production processes such as scintillation and Cherenkov light in LXe, as well as to better understand external cross-talk between neighboring SiPMs and its effect on overall detector performance. SiPM external cross-talk refers to the emission of secondary photons during the avalanche process triggered by photodetection in a SiPM cell; these photons can reach other SiPMs and produce correlated hits on nearby devices. Characterizing the SiPM pulse shape and correlated-noise contributions allows for accurate and reliable reconstruction of photons, which is needed to improve the energy and timing resolution of our photon-detection response model. To reconstruct photon signals, we have developed an improved pulse-fitting algorithm that constructs a functional form of the pulse shape. I will present the functioning of the fitter and its performance, and compare it to other photon-counting algorithms, in particular a traditional pulse-finding algorithm, with respect to improving energy resolution.
The Ar2D2 detector is designed to measure $^{39}$Ar levels in liquid argon. To determine whether the detector is usable for this purpose, RAT was used to simulate the background radiation generated by each layer of the detector, as well as by the surrounding lab. From the results, the main source of background radiation in the energy range of interest will be identified, along with any potential design changes that would mitigate this background.
The black hole solutions of Lovelock theory in five-dimensional spacetime play an important role in higher-dimensional theories of gravity. In five dimensions there are two orders of Lovelock gravity: the first order is equivalent to Einstein's theory of gravity, and the second order is equivalent to Einstein-Gauss-Bonnet (EGB) theory, which includes the quadratic curvature correction to Einstein's theory with a non-vanishing coupling constant $\alpha_2$ multiplying the Gauss-Bonnet term. In this work, the corresponding Lovelock black holes are formulated in asymptotically flat spacetime. The Lovelock black hole is obtained by finding metric solutions of the first- and second-order field equations of Lovelock theory with a static, spherically symmetric ansatz. We also study charged black holes in a spherically symmetric electric field. The first-order Lovelock black hole is equivalent to the five-dimensional Schwarzschild-Tangherlini solution.
Correspondingly, the first-order charged Lovelock black hole is equivalent to the five-dimensional Reissner-Nordström solution. The second-order Lovelock black hole is equivalent to the neutral Einstein-Gauss-Bonnet solution, and the second-order charged Lovelock black hole is equivalent to the charged Einstein-Gauss-Bonnet solution. Lastly, the properties of the solutions are studied by varying their parameters (mass, charge, and coupling constant).
The Pacific Ocean Neutrino Experiment (P-ONE) is a multi-cubic-kilometre neutrino observatory in development off British Columbia’s west coast, 2600 metres below sea level. P-ONE detects Cherenkov radiation from secondary particles of neutrino interactions in ocean water to ultimately study astrophysical neutrinos. The observatory is composed of spherical modules with outward-facing PMTs held along kilometre-long mooring lines rising from the ocean floor. Since this dynamical system sways freely in the ocean, a fraction of the optical modules (P-OMs) are calibration modules (P-CALs) that monitor the detector by calibrating its geometry and optical properties using nanosecond light pulses and an acoustic system.
This talk outlines and analyzes the characterization of currents and the resultant bending in the Cascadia Basin, with emphasis on developing a calibration baseline for detector geometry and optical properties. By analyzing on-site ocean current data over consecutive years and applying marine dynamics simulations, I place bounds on expected node movements. These are used in Markov Chain Monte Carlo simulations to estimate geometry calibration precision from photon arrival times. This works in tandem with ongoing prototyping and construction of an apparatus to calibrate light flashers to meet precision requirements. These results are used to design calibration modules for the first neutrino detector string: P-ONE-1.
The Scintillating Bubble Chamber (SBC) experiment at Queen’s must balance extremes of temperature and pressure against sensitive detection equipment that needs to be protected from both damage and background signals throughout the operation of the experiment. The active volume of SBC is cooled by a cryocooler to 90 K (−183 °C) within synthetic quartz jars that are prone to cracking if the cooling process is uneven or too fast. Ensuring the robustness of the cooling systems has been at the heart of our work this summer, motivating our ongoing development of a system of resistance temperature detectors. The cryocooler also introduces vibrations into a system sensitive to any kind of energy input, so we characterized those vibrations to confirm that they are not at risk of damaging equipment or nucleating bubbles in the active volume. Going forward, these projects will help keep SBC safe and minimize backgrounds in the experiment.
Lecture demonstrations are an essential part of physics education, allowing students to develop a more intuitive and meaningful grasp of key scientific concepts. As particle detectors, cloud chambers have the unique property of allowing real-time, naked-eye visualization of ionizing radiation through the production of condensate trails in a super-saturated alcohol atmosphere. For this reason, the cloud chamber is an excellent learning aid for entry-level physics courses and for outreach, capturing students’ attention and imagination. The goal of this project is to construct an inexpensive, standalone cloud chamber using off-the-shelf parts and easily available tools. To accomplish this, a two-stage thermoelectric cell was used to cool the chamber, housed within a 3D-printed frame. The device is portable and powered by a computer power supply. This presentation will detail the construction and design of the chamber and outline the learning goals for the chamber as an outreach tool. A complete, open-source build guide will be available for individuals seeking to construct one of their own.
DEAP-3600 is a large liquid argon detector located 2 km beneath Earth's surface at SNOLAB. DEAP-3600 searches for evidence of dark matter in the form of weakly interacting massive particles (WIMPs) via direct detection. The detector is a spherical acrylic vessel containing 3.3 tonnes of liquid argon. A typical event in DEAP-3600 begins with a nuclear or electronic recoil: this produces UV scintillation light that is wavelength-shifted to the visible and detected using photomultiplier tubes (PMTs). To discern WIMP-candidate events from background events, an event selection has been implemented in the data analysis. Some of these selection cuts are designed to isolate background events from alpha decays in the neck region of DEAP-3600 from potential WIMP-candidate events originating in the spherical vessel. A re-optimization of these cuts against neck alpha backgrounds will be presented; it improves the dark matter sensitivity by maintaining the same WIMP signal efficiency while reducing background leakage.
Some beyond-Standard-Model theories, such as the Axi-Higgs model (Fung et al., 2021), suggest the existence of exotic light scalar fields that couple to matter. During high-energy astrophysical events such as binary neutron star (BNS) mergers, these scalar fields may be emitted as radiation. Our project proposes a novel method to detect such radiation using the Global Positioning System (GPS). GPS satellites carry atomic clocks onboard to correct for relativistic effects, so they form a quantum sensor array around the globe, a facility potentially available for astronomical observations. Interactions between the emitted light scalar field and the atomic clocks can generate effective changes in fundamental constants, such as the electron mass and the fine-structure constant, imprinting measurable signals on the atomic clocks. Therefore, we may be able to detect the signal of light scalar radiation using satellite data. This method has a few advantages. First, the facility already exists, so there is no need to build expensive new apparatus. Second, our previous calculations (Dailey et al., 2020) show that other forms of signal, such as the gravitational wave signal, are shielded from the atomic clocks on the GPS, thanks to their low sampling rates. Moreover, since about 20 years’ worth of GPS data is available in the database, we can search back in time for low-mass scalar field bursts by correlating with LIGO data or short gamma-ray bursts.
In our previous paper (Dailey et al., 2020), we discussed the possibility of using GPS to detect a monochromatic light-scalar-field signal. In an actual BNS event, rather than a monochromatic signal, we expect an emission waveform closely related to the inspiral of the BNS, subsequently modified by propagation effects. By considering quadrupole radiation, we derive an expression for the waveform, and hence for the spectral density (up to a multiplicative constant that depends on the strength of the coupling). Using sample satellite data, we also analyzed the background noise in the satellite data. From this, we can calculate the signal-to-noise ratio as a function of frequency, which tells us which part of the signal is visible to the detectors and how strong the coupling must be for the signal to be visible. Another interesting question is how well the timing information from the GPS network can localize the event in the sky.
References
Dailey, C., et al. (2020). Quantum sensor networks as exotic field telescopes for multi-messenger astronomy. Nature Astronomy. https://www.nature.com/articles/s41550-020-01242-7
Fung, L. W. H., et al. (2021). Axi-Higgs Cosmology. https://arxiv.org/abs/2102.11257
The Schrödinger equation (SE) is not just for quantum mechanics; it can describe galactic dark matter (DM) too. Fuzzy Dark Matter (FDM) is one such hypothetical model: a particle so light that its de Broglie wavelength is light years long. The distribution of FDM thus behaves like a density wave, following the SE on galactic and cosmological scales. Recent simulations suggest that FDM would cause galaxies to evaporate over time, spitting stars out of orbit. However, these simulations are not fully self-consistent, and we believe these findings should be verified; that is, a system comprised of FDM and particles should reside and evolve in the gravitational potential produced by both the FDM and the particles. This talk presents background information on FDM, the argument for its elimination as a DM candidate, and preliminary results of our own fully self-consistent simulations of gravitational interactions between FDM and particles in a 1D model of a galaxy.
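To make the "light years long" claim concrete, here is a back-of-envelope estimate of the de Broglie wavelength. The particle mass (1e-22 eV, a value often quoted for FDM) and the galactic orbital speed (200 km/s) are illustrative assumptions, not numbers from this abstract:

```python
h  = 6.626e-34   # Planck constant, J s
eV = 1.602e-19   # joules per electronvolt
c  = 2.998e8     # speed of light, m/s
ly = 9.461e15    # metres per light year

m = 1e-22 * eV / c**2     # particle mass in kg (assumed 1e-22 eV/c^2)
v = 2.0e5                 # m/s; typical galactic orbital speed (assumed)
wavelength = h / (m * v)  # de Broglie wavelength lambda = h / (m v)
print(f"lambda ~ {wavelength / ly:.0f} light years")
```

For these assumed values the wavelength comes out at roughly two thousand light years, i.e. comparable to galactic structure, which is what lets the SE describe the DM distribution itself.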
SNO+ is a liquid organic scintillator detector aiming to study neutrinos. The detector is now completely filled with scintillator, and the addition of wavelength shifter has been completed.
Within SNO+, events are tagged as muons by two main criteria: a large nHit value (the number of hit PMTs per event) and a high number of OWL (OutWard-Looking PMT) hits. When a muon is detected, 20 s of livetime is lost. It is worthwhile to study these so-called “muon events” because they can lead to a better understanding of the detector and to explanations of what some of these events actually are.
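The livetime cost is straightforward to estimate: each tagged muon removes 20 s. As a hedged illustration, assuming a rate on the order of a few muons per hour (an assumed round number, not the measured SNO+ rate):

```python
muons_per_hour = 3.0      # assumed illustrative rate, NOT the measured SNO+ value
deadtime_per_muon = 20.0  # seconds of livetime lost per tagged muon (from the text)
lost_fraction = muons_per_hour * deadtime_per_muon / 3600.0
print(f"~{lost_fraction:.2%} of livetime lost to muon tagging")
```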
The nEXO experiment is being designed to search for neutrino-less double beta decay ($0\nu\beta\beta$) in a 5000 kg liquid xenon time projection chamber (TPC) enriched to the isotope xenon-136. nEXO's > 10$^{28}$ year sensitivity reach to the $0\nu\beta\beta$ half-life requires extremely low backgrounds from external sources. Backgrounds are dealt with in part by surrounding the TPC with an outer detector (OD) in the form of a cylindrical water tank. The OD serves both to passively shield from incident particles like gammas and neutrons from local U and Th decays and also to actively veto cosmogenic backgrounds by detecting the Cherenkov light of passing muons using photomultiplier tubes (PMTs). These muons undergo spallation processes on local nuclei sending neutrons into the TPC and activating the xenon.
In this talk, we discuss the simulation of incident cosmic muons and the neutrons they induce. Building on earlier simple Monte Carlo muon simulations, FLUKA was deployed to simulate the muonic backgrounds with more comprehensive physics. The simulation techniques will be discussed along with preliminary results.
The properties of low-mass dark matter halos and subhalos (less than 10^9 solar masses) depend heavily on the particle nature of dark matter. Strong gravitational lensing provides a direct probe of these low-mass halos. The relative brightnesses of lensed images (flux ratios) in quadruply-imaged quasars (quads) are sensitive to low-mass dark matter structure and can be used to constrain dark matter theories. In particular, ultra-light dark matter (ULDM) refers to a class of theories, including ultra-light axions, with particle masses as light as 10^-22 eV. These particles are so light that quantum mechanical effects can manifest on galactic scales in ULDM theories. First, quantum pressure between the ULDM particles leads to the suppression of small-scale structure. Second, wave-like interference patterns in the density profiles of ULDM halos cause large fluctuations in the dark matter mass density on scales comparable to the de Broglie wavelength of the particle. I will present constraints on ULDM models from 11 strong lenses, accounting for both structure suppression and density fluctuations. I will then show that the fluctuations in ULDM can significantly impact flux ratios in quads, and therefore affect particle-mass constraints.
Radon assays are an effective way to determine the amount of radon in a particular area or substance, and therefore to assess its state. The assay technique currently in use has been performed for several years and has been developed and improved to collect data from a large number of vessels beyond its original application. Assays allow us to focus specifically on $^{222}$Rn, given its abundance in an underground lab situated deep in a mine. Because the experiment requires extremely low backgrounds, it is critical that the assay system be calibrated for these conditions in order to accurately reflect the state of the experiment. Performing an assay involves trapping the radon atoms in specially designed traps, cryogenically freezing them to concentrate the atoms, and then heating them so that they follow a path leading to a Lucas cell. The Lucas cell is clear and coated in ZnS, so that when the trapped radon is placed in front of a PMT, the alpha particles from its decays can be detected, thereby measuring the amount of radon in a known amount of gas. These assays are often repeated several times to ensure accurate results and to determine whether the amount of radon satisfies the limits that have been put in place. A number of important assays have been performed, such as UI assays, UI assays connected to the Radon Monitor, and assays of the newly built LN2 plant.
A mysterious excess in the diffuse far-ultraviolet (FUV) background radiation was observed by the Galaxy Evolution Explorer (GALEX) orbiting telescope in its 1344–1786 Å band. This radiation remains strong even at high galactic latitude, where young blue stars, the only known source of UV photons, do not exist. Scattered light from UV sources in the galactic disk is also unable to account for this FUV background.
The novel Axion Quark Nugget (AQN) dark matter model may provide an interpretation for this as-yet-unexplained excess. This model proposes that dark matter consists entirely or partially of macroscopic composite objects of nuclear density. These nuggets exist in matter and antimatter variants with approximately equal abundances. The antimatter nuggets are composed of a core of antiquarks in the color superconducting phase, immersed in a positron-sphere that guarantees the near electrical neutrality of the nugget.
Baryons colliding with the antimatter AQNs can annihilate, heating the positron-sphere, which in turn emits radiation very similar to bremsstrahlung. The FUV part of the resulting emission may be responsible for the excess seen by the GALEX telescope in the Milky Way. The main contributors of baryons to this emission are not found in stellar matter, but rather in free protons from the ionized gas of the Warm-Hot Intergalactic Medium (WHIM) surrounding the Milky Way, extending out to the virial radius. To test this hypothesis, the spectral emissivity of baryon–(anti)AQN annihilation, along with the baryon and dark matter number densities, is used to create a sky map of the expected FUV emission in a Milky Way-like galaxy. This emission sky map is compared with GALEX FUV flux maps, matching in latitude and longitude.
Understanding the exact source(s) of this FUV excess may bring us a step closer to revealing the exact nature of dark matter.
Silicon photomultipliers (SiPMs) are tightly packed arrays of single-photon avalanche diodes (SPADs), biased above breakdown, that undergo a self-sustaining charge avalanche upon absorption of an incident photon. Due to their compactness, high single-photon resolution, low noise, and ability to operate at cryogenic temperatures, SiPMs are emerging as a baseline photon-sensing solution in a number of rare-event searches in physics, notably the planned nEXO neutrinoless double-beta decay experiment. An unfortunate byproduct of the avalanche process is the production of secondary photons. These can trigger avalanches in neighbouring SPADs, or leave the SiPM entirely and trigger a neighbouring sensor, which has a systematic effect on detector performance. The Microscope for Injection and Emission of Light (MIEL) is a custom setup developed at TRIUMF enabling the study of secondary photon emission in SiPMs by stimulating a SPAD with a laser. The setup is used to map the light emission spatially across the SiPM surface and to obtain a spectral distribution of the emitted photons.
NEWS-G develops spherical proportional counters (SPCs) for low-mass dark matter detection. SPCs are spherical detectors with a sensor at the centre that can be filled with various target gases. To effectively interpret detector data, the detector needs to be calibrated using various sources of known energy. Particularly for low-mass dark matter detection, aluminium x-rays can be used as a low-energy calibration source. The method devised at NEWS-G for generating aluminium x-rays is to wrap an americium-241 source in aluminium foil, as the alpha particles released by the americium will deposit energy in the foil and cause x-rays to be emitted. This talk will highlight the importance of detector calibration in dark matter detection and describe the process of optimizing the americium source’s aluminium foil thickness such that alpha particles are not released into the detector while also ensuring sufficient aluminium x-ray emission.
This presentation will cover the major projects, experiments, and research that I completed over the summer on the guide tube design. I will summarize previous summer students' contributions to the project and describe how I continued and adapted that work. Before constructing the guide tube, I carried out an experiment, named "The Force Test", to determine the frictional forces acting between the copper guide tube and the source cable; the results and future plans will be shared. Next, I will describe the creation and arrangement of the guide tube structure: a wooden frame was built to outline and hold the tubing, and I will delve into the planning and construction process, including the preparation and tube-bending work, plus the hardware used to secure the tubing to the wooden frame. I will then describe how the wooden structure and tubing were moved and supported within D-140. Finally, I will share the prototyping results from moving the source cable through the copper guide tube design.
Asymmetric Dark Matter (ADM) could be trapped in the Sun and other stars. In the past, two heat-transport formalisms were developed to explain how this ADM could alter temperature gradients in stars, which could in turn alter key observables such as neutrino flux and asteroseismological signatures. More recent work has developed a recalibrated form of the Spergel & Press transport formalism that also allows the ADM cross sections to be momentum- or velocity-dependent. This new formalism has been tested on short “snapshots” of a star, but not in full stellar evolution simulations. My summer project is to implement these new formalisms in a Fortran-based stellar evolution code to model the theoretical effects of dark matter in stars, which can later be compared to real asteroseismology data.
The Scintillating Bubble Chamber (SBC) is a next-generation dark matter detector that uses silicon photomultipliers to veto background electron-recoil-induced events. The silicon photomultipliers detect incoming scintillation photons, and the resulting pulses are analyzed through computational methods. Specifically, we contrasted traditional algorithmic methodologies with supervised machine learning approaches to determine which yields the highest accuracy. Additionally, an unsupervised machine learning clustering algorithm was used to target single-pulse events to boost the current pulse analysis algorithms. All of these computational methods aim to reduce background when identifying dark matter related events.
Laurentian University Doran Planetarium
https://laurentian.zoom.us/j/93614635887?pwd=MUR1NjJyaDdzN3d3aUl5T2tJQ0tvdz09
SNO+ is a scintillator-filled neutrino detector located 2 km underground at SNOLAB. The primary goal of the SNO+ experiment is to search for neutrinoless double beta decay (0vbb). The rarity of this phenomenon necessitates a high level of sensitivity, making background analysis crucial. In this presentation, I will outline the methods used to find and characterize $^{40}$K backgrounds in SNO+, a signal that is notoriously difficult to measure given its statistical rarity. The methods used in this analysis exploit the structural symmetry of the hold-down ropes about the SNO+ detector to make a data-driven measurement of the $^{40}$K background. This measurement can then be used to quantify the $^{40}$K background from other sources, such as the acrylic vessel (AV) and the scintillator inside the detector.
Astronomers use a variety of particle messengers (e.g., EM radiation, neutrinos, cosmic rays) to make their observations, with each messenger providing unique information about the astrophysical object. To ensure that a wide range of data is available to analyse transient events, networks such as NASA’s Gamma-ray Coordination Network (GCN) distribute automated notices from certain telescopes immediately after they detect a potential astronomical event. Other telescopes use these notices to inform their follow-up observations, which are then distributed via the same network.
GCN has provided a useful service since 1997, but the legacy of its initial design means that the data it distributes is not readily usable for automated analysis. To ease these challenges, we created the High-Energy Transients Database (hetDB). Developed with a principal focus on making data from IceCube’s neutrino notices (and follow-up circulars) usable for analysis, it extracts information from GCN notices and circulars and stores it in a persistent database. The database has an idiomatic Python interface, designed for use by those unfamiliar with GCN’s internal workings. This allows one to easily integrate information from GCN with other data, such as the Fermi catalog of gamma-ray sources and reports of historic gamma-ray flux from each.
In this talk, I will present the status of hetDB development and speak to future plans in the field.
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is a treaty adopted by the United Nations General Assembly whose basic obligations prohibit any ratifying nation from carrying out nuclear explosions. To detect possible nuclear events, xenon isotopes produced in such events can be found in atmospheric air samples through beta-gamma coincidences.
The CTBT system can detect these beta-gamma coincidences using two detectors in conjunction. Beta particle emissions are detected by a PIPS detector. Gamma ray emissions are detected by a High Purity Germanium (HPGe) detector. Simultaneous beta particle and gamma ray detections indicate a coincidence.
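The coincidence logic above can be illustrated with a minimal sketch that matches beta and gamma timestamps within a fixed time window. The window value and data format are assumptions for illustration, not parameters of the actual CTBT system.

```python
def find_coincidences(beta_times, gamma_times, window=1e-6):
    """Return (beta_t, gamma_t) pairs whose separation is at most `window` seconds.

    A two-pointer sweep over sorted timestamp lists, standard for
    offline coincidence tagging between two detector channels.
    """
    betas = sorted(beta_times)
    gammas = sorted(gamma_times)
    pairs = []
    j = 0
    for bt in betas:
        # Advance past gamma events too early to match this beta event.
        while j < len(gammas) and gammas[j] < bt - window:
            j += 1
        # Collect every gamma event inside the coincidence window.
        k = j
        while k < len(gammas) and gammas[k] <= bt + window:
            pairs.append((bt, gammas[k]))
            k += 1
    return pairs
```

A simultaneous PIPS and HPGe detection then corresponds to a matched pair, while unmatched events in either list are treated as singles.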
Commissioning this system in SNOLAB’s underground clean lab is of interest because it significantly reduces the background radiation present at the surface of the Earth.
Bringing the CTBT system into working order requires that both the PIPS and HPGe detectors be brought into accurate operation, achieved when a source’s beta or gamma spectrum is well measured compared to theoretical values.
For the PIPS detector, pulse-processing quantities such as rise times and flat tops were adjusted at the recommendation of Health Canada, components generating noise in the spectrum were removed, and the gain was adjusted. The HPGe detector was calibrated similarly to the other germanium detectors underground at SNOLAB, notably through programmatic optimization of the flat-top and rise-time pulse-processing parameters and of the high-voltage level.
The final component of commissioning was the adjustment of the liquid nitrogen (LN2) flow rate used to purge radon from the source chamber. A preferred LN2 flow rate was determined from among three levels.
This talk will outline the trials and steps taken to bring the CTBT system into its current state of operations, with additional preliminary results.
MATHUSLA is a long-lived particle detector scheduled to begin operation in 2025 at the Large Hadron Collider (LHC) at CERN. The detection of long-lived particles requires precise trigger and background settings; the MATHUSLA detector can tune these settings in a way the main LHC detectors cannot, where such particles are effectively impossible to detect. The results of the MATHUSLA experiment will help us search for and detect particles whose lifetimes stretch up to the Big Bang Nucleosynthesis limit.
The experiment at the University of Toronto lab focused on testing wavelength-shifting fibers (WLSFs) with silicon photomultipliers (SiPMs). SiPMs, electric pulses, scintillating bars, and an oscilloscope were used in the testing of the fibers. Our work consisted of measuring the time and energy delay and the attenuation of the fibers, allowing us to use the cosmic-ray setup with the best-performing fiber to detect cosmic rays. The results of the experiment will help the team at the University of Toronto build a small-scale model of the MATHUSLA detector.
https://laurentian.zoom.us/j/93614635887?pwd=MUR1NjJyaDdzN3d3aUl5T2tJQ0tvdz09
The nEXO (next Enriched Xenon Observatory) collaboration is searching for lepton-number-violating neutrinoless double beta decays (0νββ) in Xe-136. A positive observation would require the neutrino to be its own antiparticle, i.e. a Majorana particle, and would shed light on various open questions in neutrino physics and physics beyond the standard model.
A Ba-tagging technique is being developed at McGill, focusing on the extraction and identification of Ba ions from xenon gas to push the sensitivity of the detector further and clearly distinguish ββ events from background events. The in-gas laser ablation ion source (IGLAS) was built to create laser-driven ions in a gaseous environment for systematic studies and calibration of various components of the Ba-tagging scheme. Simulations of the current IGLAS geometry were performed in the software SIMION to study the ion-transport efficiency and the time of flight and mobility of Cu+ ions in Ar and Xe gas under various conditions.
The SNO+ experiment is located 2 km underground at SNOLAB with the primary purpose of studying neutrino interactions. The detector is a 12-metre-diameter acrylic sphere filled with 780 tonnes of liquid scintillator, which produces light when a charged particle passes through it. This volume is surrounded by almost 10000 photomultiplier tubes (PMTs), which detect the light from the scintillator. At SNO+, a minimal, well-understood detector background is crucial to obtain meaningful data, thus the background is constantly analyzed. Through this analysis, a new population of background events was observed that does not fall into the SNO+ analysis region, but is interesting in its spatial, temporal, and energy distributions. This presentation will describe these events, as well as the work we have done to narrow down their source.
High-energy physics experiments rely on sensitive optical devices such as silicon photomultipliers (SiPMs). While SiPMs have excellent single-photoelectron resolution and linearity, they exhibit a strong temperature dependence. It is important to properly test light sources and temperature-control systems prior to testing a SiPM. My work focused on designing a laboratory setup to test the wavelength-dependent performance of optical sensors such as SiPMs in a temperature-controlled environment. I will discuss the performance of the current test setup and plans for the optical-sensor performance tests.
The radiopurity.org database has proven to be a valuable resource for the low-background physics community as a tool to track and share assay results. A SNOLAB instance for screening results of samples measured via HPGe underground at SNOLAB is available at hpge-radiopurity.snolab.ca.
This talk will describe the recent progress in uploading new data onto the database and further upgrades which are underway in collaboration with Pacific Northwest National Laboratory.
The nEXO experiment is designed with the goal of observing neutrinoless double beta decay by placing 5000 kg of liquid xenon (enriched to 90% in 136Xe) within a time projection chamber (TPC). To achieve the desired sensitivity to a neutrinoless double beta decay half-life of 1.35 × 10^28 years, experimental backgrounds need to be characterized and reduced as much as possible. To limit cosmogenic backgrounds, the TPC is immersed in a large water tank over two kilometres underground. Despite these efforts, cosmogenic muons can still cause backgrounds at this depth. It is therefore important to be able to identify which muons pass close enough to the TPC to cause this background. Some of this can be achieved by tagging cosmic muons passing through the water tank, using the ~125 photomultiplier tubes (PMTs) to detect the Cherenkov radiation produced by the muons. Chroma, a ray-tracing simulation software, is being used to optimize the Outer Detector design. A crucial part of this simulation is understanding and verifying the optical properties used in Chroma. The reflection, refraction, absorption, and detection of photons are all properties crucial to the integrity of the simulation. Once these properties are incorporated in the simulation, we will be able to determine our trigger conditions.
This talk summarizes the ways in which we verify the optical parameters used in Chroma for nEXO’s Outer Detector and determine trigger conditions, balancing efficiency and PMT backgrounds.
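One simple way to sanity-check such optical parameters, sketched here with illustrative probabilities (not nEXO values), is to sample many photon-surface interactions and confirm that the recovered outcome fractions match the inputs fed to the simulation.

```python
import random

# Sketch: sample photon fates with fixed outcome probabilities and
# verify the recovered fractions. Probabilities are illustrative only,
# not the optical parameters actually used in Chroma for nEXO.
def sample_fates(p_reflect, p_absorb, p_detect, n, seed=0):
    """Simulate n photon-surface interactions; return the outcome fractions."""
    assert abs(p_reflect + p_absorb + p_detect - 1.0) < 1e-9
    rng = random.Random(seed)  # seeded for reproducibility
    counts = {"reflect": 0, "absorb": 0, "detect": 0}
    for _ in range(n):
        u = rng.random()
        if u < p_reflect:
            counts["reflect"] += 1
        elif u < p_reflect + p_absorb:
            counts["absorb"] += 1
        else:
            counts["detect"] += 1
    return {k: v / n for k, v in counts.items()}
```

Deviations between recovered and input fractions beyond the expected statistical spread would flag a bookkeeping error in how the simulation applies its optical properties.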
Liquid scintillation has been a standard means of particle detection for decades, typically through the use of a transparent medium containing molecules that transfer the kinetic energy of incoming particles into photon emission through de-excitation, allowing the energy of the particle to be measured. An example of such a liquid is the mixture of the solvent LAB and the fluor PPO used in SNO+.
The question explored here is: what if one were to take this scintillating mixture and add some PRS, a surfactant, along with a few droplets of water to make it opaque, anywhere from faintly milky to very cloudy? What are the light-collection properties of such liquids that would have an application in the new LiquidO detection technique? This is what I have explored throughout the summer.
Many samples were made by adding various concentrations of PRS to the standard LAB-and-PPO mixture, along with different amounts of water droplets. These samples, in both thin and thicker layers of scintillator, were then exposed to a cesium-137 source while attached to a PMT in a dark box to capture the light. Maestro was then used to compare the samples’ ability to capture the light.
The proposed nEXO detector aims to search for neutrinoless double beta decay (0$\nu\beta\beta$) in a five-tonne enriched liquid Xe-136 time projection chamber (TPC). The search in a TPC offers the unique possibility to locate, extract from the detector volume, and identify the $\beta\beta$-decay daughter Ba-136. The addition of the Ba-tagging technique in a future upgrade to nEXO has the potential to eliminate essentially all background events, except those from $\beta\beta$-decays. The approach being explored by Canadian groups uses a capillary to first extract the candidate Ba ion with a small volume of liquid Xe from the detector. The liquid Xe then undergoes a phase change before the daughter ion is guided by a radio-frequency (RF) carpet to an RF funnel, where it will be separated from neutral particles and transported to vacuum. The ion is then transported to a linear Paul trap and a multi-reflection time-of-flight mass spectrometer for identification and mass confirmation, respectively.
To characterise the in-gas ion transport, an In-Gas Laser Ablation Source (IGLAS) is being developed to study the production of ions in a controlled environment. A pulsed 532 nm Nd:YAG laser is focused on a metal target to ablate ions from the surface. Ions are drifted by an applied electric field and collected. The IGLAS has been previously tested in vacuum while studies on ion production and transport in high-pressure Xe and Ar gas are ongoing. The status of the IGLAS will be discussed.
DEAP-3600 at SNOLAB uses 3600 kg of liquid argon to search for dark matter. The argon used must be pure at the part per billion level for the detector to operate properly and for the pulse-shape discrimination to work. In this talk I will discuss the recommissioning of the DEAP-3600 argon process system for the third fill of the detector that will take place this winter. This talk will follow the flow of argon through the purification loop, which consists of the process pump, hot zirconium getter, radon trap, condenser, and boiler.
To meet the required sensitivity of current and forthcoming rare-event search experiments within the astroparticle physics community, it is important to select radiopure shielding materials that protect from external radiation and keep the radioactive background to a minimum. Measurements of surface activities performed with the XIA UltraLo-1800 detector will be discussed for common components of the shielding of astroparticle experiments, such as copper and polytetrafluoroethylene.
I will talk about the minimum distance at which the electric field of a point charge still exists, examined by taking the limit at very close distances.
Silicon photomultipliers (SiPMs) are compact arrays of single-photon avalanche diodes (SPADs). The Vacuum Emission Reflection Absorption (VERA) apparatus is a custom setup developed at TRIUMF to measure the photon detection efficiency and reflectivity of SiPMs in a vacuum chamber at cryogenic temperatures. Photon detection efficiency is one of the most critical parameters of any single-photon detector and plays a role in the development of all SiPMs. It quantifies the ability of a single-photon detector to detect photons and can be described as the ratio between detected photons and photons arriving at the detector. The VERA apparatus uses several measurements of SiPMs and photodiodes to determine the photon detection efficiency.
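The ratio definition above can be written down directly. This is a minimal sketch with a simple binomial uncertainty attached, not VERA's actual analysis chain; the counts in the example are illustrative.

```python
import math

# Sketch: photon detection efficiency (PDE) as the ratio of detected
# photons to photons arriving at the sensor, with the binomial
# standard error on that ratio.
def pde_with_uncertainty(n_detected, n_incident):
    """Return (PDE, statistical uncertainty) for the given photon counts."""
    if n_incident <= 0 or not 0 <= n_detected <= n_incident:
        raise ValueError("require 0 <= n_detected <= n_incident, n_incident > 0")
    p = n_detected / n_incident
    sigma = math.sqrt(p * (1.0 - p) / n_incident)  # binomial standard error
    return p, sigma
```

In practice the number of incident photons is itself inferred from a calibrated reference sensor, so the real uncertainty budget is larger than this purely statistical term.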
The Single Photon Air Analyzer (SPAA) project aims to develop a proof of concept for a new optical particulate matter detector. This detector is novel in that it uses the scattering angle of photons to determine the size of particulate matter in air. An array of silicon photomultipliers (SiPMs), solid-state sensors capable of single-photon resolution, will count small numbers of photons scattered by the particulate. SiPMs are more sensitive than the traditional photodiodes used by existing particulate matter detectors. The SPAA concept, if successful, would provide a portable, low-cost, and highly sensitive particulate matter detector. In this presentation I will review the motivation for this detector and its fundamental operating principles, and present preliminary data.
This study examines new BiPo-214 nhit cuts for 1.1 g/L and 2.2 g/L. Previously, SNO+ was using the nhit cuts derived at 0.5 g/L; we evaluate how the nhit cuts change and how that affects the tagging efficiency. After looking at how the different nhit cuts affect the tagging efficiency, we must also consider the accidental rate.
CUTE is a test facility currently testing high-voltage detectors for SuperCDMS, in hopes that they will one day detect dark matter. These detectors need to be calibrated to understand the response we get during a run. CUTE currently has a system installed for gamma calibration. This talk will outline the progress made during the winter and spring terms on the development and installation of a neutron calibration system at CUTE, which will improve our understanding of the detectors’ nuclear-recoil response.
Devices that can produce images of gamma-ray sources in the energy range from 100 keV to several MeV have applications in astronomy as well as safety and security. The technology of choice in this energy range relies on the phenomenon of Compton scattering. A gamma ray scatters in the front layer of the detector, giving up some of its energy. The scattered, lower-energy photon is then absorbed in the back layer of the detector. Knowing the energies and positions of these two events allows one to reconstruct the energy and incident direction of the gamma ray. Compton imagers are usually constructed from two layers of pixelated detectors, but their cost and complexity scale as N^2, where N is the number of pixels in a row of an NxN array. Here we report on a way to reduce the scaling to 2N by reading out the detector by rows rather than by pixels.
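The reconstruction described above rests on the standard Compton kinematics: given the energy deposited in the scatter layer and the energy of the photon absorbed in the back layer, the Compton formula yields the scattering angle, constraining the source direction to a cone. A minimal sketch (energies in keV, example values illustrative):

```python
import math

M_E_C2 = 511.0  # electron rest energy in keV

def compton_angle_deg(e1_kev, e2_kev):
    """Scattering angle from a two-site Compton event.

    e1_kev: energy deposited by the scatter in the front layer.
    e2_kev: energy of the scattered photon absorbed in the back layer.
    The incident gamma-ray energy is e1 + e2, and the Compton formula
    gives cos(theta) = 1 - m_e c^2 * (1/E' - 1/E).
    """
    cos_theta = 1.0 - M_E_C2 * (1.0 / e2_kev - 1.0 / (e1_kev + e2_kev))
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("energies inconsistent with Compton kinematics")
    return math.degrees(math.acos(cos_theta))
```

For a 662 keV Cs-137 photon scattering at 90 degrees, for example, the scattered photon carries about 288 keV and the electron about 374 keV, and the formula recovers the 90-degree angle.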
There are several candidates in the broad search for dark matter. Prevalent among them is an ultralight particle interacting via some scalar field, which would induce an isotropic strain on all condensed bodies. Our dark matter detector (HELIOS) aims to measure this effect in the 2 kHz frequency range, and thus requires strong vibration isolation in that range. We have created a new kind of vibration-isolation suspension, which uses circular Catherine-wheel springs cut out of oxygen-free copper sheets. These springs take up significantly less vertical height, meaning more isolation stages can fit above the shell of our dilution refrigerator, which increases the overall attenuation. Since stainless-steel coil springs also have very poor thermal conduction, similar suspensions are generally thermally bypassed using copper wires or rods. This is unnecessary with the new design, as the entire suspension is made of copper. Strong attenuation was measured in the desired frequency range, as well as a clear improvement in overall attenuation as more stages were added.
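The benefit of stacking stages can be illustrated with the standard high-frequency approximation for mass-spring isolators in series: well above each stage's resonance f0, one stage attenuates roughly as (f0/f)^2, so n stages give (f0/f)^(2n). The numbers below are illustrative, not HELIOS parameters.

```python
# Sketch: idealized high-frequency transmissibility of n identical
# mass-spring isolation stages in series. Valid only well above the
# stage resonance; damping and stage coupling are neglected.
def transmissibility(f_hz, f0_hz, n_stages):
    """Approximate motion transmitted through n stages at frequency f."""
    if f_hz <= f0_hz:
        raise ValueError("approximation valid only well above resonance")
    return (f0_hz / f_hz) ** (2 * n_stages)
```

With an assumed 20 Hz stage resonance, each extra stage at 2 kHz multiplies the attenuation by another factor of ten thousand, which is why fitting more stages in the available vertical height matters.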
Astrophysical measurements of positron and electron cosmic-ray spectra by the PAMELA and ATIC experiments have observed a rise in the positron fraction starting at $E \sim 10$ GeV and extending up to $E \sim 100$ GeV, as well as a broad excess in the total $e^{+}+e^{-}$ spectrum extending from several hundred GeV to $O(\text{TeV})$. This could be explained by dark matter annihilation $\chi\chi \to e^{+}e^{-}$, but it requires a larger annihilation cross section than is allowed by the thermal relic abundance. Moreover, to avoid the antiproton constraints from the observational data of the PAMELA experiment and to generate hard $e^{+}e^{-}$ spectra, the dark matter should annihilate largely into leptons, pions, kaons, and other light states.
One possibility is annihilation of the dark matter into a new force carrier, which would then decay into kinematically accessible standard model states. A new GeV-scale force in the dark sector could therefore naturally generate both a large annihilation cross section via Sommerfeld enhancement and the observed $e^{+}e^{-}$ spectra.
If dark matter is self-interacting, either via exchange of a standard model gauge boson or due to some new force, then the Sommerfeld enhancement must be taken into account when computing annihilation cross sections in the present-day galactic halo.
The presence of light force carriers ($\sim$ GeV) coupling to the dark matter would naturally lead to atypical dark matter annihilation signatures, with zero or small branching ratios into hadrons and gauge bosons and a hard spectrum of leptons, pions, and other lighter states. Provided the mass of the force carrier is less than twice the proton mass, no excess antiprotons would be produced.
In general, a new GeV-scale dark force could naturally generate both a large annihilation cross section via Sommerfeld enhancement and the observed $e^{+}e^{-}$ spectra without violating the observational antiproton constraints.
To accommodate various dark matter annihilation models, the experimental data of PAMELA and ATIC require higher cross-section values, or boost factors, than our cosmological expectation. We introduce a new force, or mediator particle, to address these observational anomalies. In our model, a dark matter particle annihilates into standard model particles via a new force carrier $\phi$. Because this force carrier is exchanged many times before annihilation into standard model particles, our estimated cross section is enhanced by the Sommerfeld factor of a Yukawa potential. With this phenomenology, we can naturally fit the astrophysical data from PAMELA and ATIC without the need for a large boost factor.
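The Sommerfeld factor is easy to evaluate in the massless-mediator (Coulomb) limit, $S = (\pi\alpha/v)/(1 - e^{-\pi\alpha/v})$, which is a standard approximation when the mediator mass is small compared to $\alpha m_\chi v$; the full Yukawa-potential case requires a numerical solution. The coupling and velocity values in the example are illustrative, not fitted model parameters.

```python
import math

# Sketch: Sommerfeld enhancement in the Coulomb (massless-mediator) limit.
# alpha: dark-sector coupling strength; v: relative velocity in units of c.
def sommerfeld_coulomb(alpha, v):
    """Return the Coulomb-limit enhancement S = x / (1 - exp(-x)), x = pi*alpha/v."""
    x = math.pi * alpha / v
    if x < 1e-8:          # negligible coupling: no enhancement
        return 1.0
    return x / (1.0 - math.exp(-x))
```

At typical galactic-halo velocities $v \sim 10^{-3}$, the factor grows as $\pi\alpha/v$, which is the mechanism by which the present-day annihilation rate is boosted without spoiling the thermal relic abundance set at much higher velocities.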