Harnessing the Data Revolution (HDR) is an effort by the National Science Foundation (NSF) to promote the exploration of fundamental scientific questions using data-driven techniques. To build interest in these approaches and in the HDR community, we have developed a Machine Learning (ML) challenge for anomaly detection that draws on diverse data from several HDR institutes. This...
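As a flavor of the kind of baseline entry such a challenge might invite, the sketch below flags outliers in a tabular feature set with an Isolation Forest; the data, feature dimensions, and injected anomalies are synthetic placeholders, not the challenge's actual datasets.

    # Hypothetical anomaly-detection baseline on synthetic stand-in data.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(5000, 8))                     # "normal" events
    X_test = np.vstack([rng.normal(size=(990, 8)),
                        rng.normal(loc=6.0, size=(10, 8))])  # injected anomalies

    clf = IsolationForest(n_estimators=200, random_state=0).fit(X_train)
    scores = -clf.score_samples(X_test)      # higher score = more anomalous
    flagged = np.argsort(scores)[-10:]       # top candidates for inspection
    print(sorted(flagged))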
The High-Luminosity Large Hadron Collider (HL-LHC), anticipated to begin operations in 2029, will generate data at an astounding rate on the order of 100 terabits per second. To efficiently process and filter these data, the Compact Muon Solenoid (CMS) experiment relies on the extremely low-latency Level-1 trigger, which uses Field-Programmable Gate Arrays (FPGAs). My project focuses on...
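For illustration, the sketch below follows the commonly used hls4ml workflow for turning a small neural network into FPGA firmware, as is typical of Level-1 trigger studies; the network, its input features, and the project settings are placeholder assumptions, not the actual trigger model.

    # Toy trigger classifier compressed to FPGA firmware with hls4ml.
    import hls4ml
    from tensorflow import keras

    inputs = keras.Input(shape=(16,))                  # assumed input features
    x = keras.layers.Dense(32, activation="relu")(inputs)
    outputs = keras.layers.Dense(1, activation="sigmoid")(x)  # accept/reject score
    model = keras.Model(inputs, outputs)

    config = hls4ml.utils.config_from_keras_model(model, granularity="model")
    hls_model = hls4ml.converters.convert_from_keras_model(
        model, hls_config=config, output_dir="l1_trigger_prj"
    )
    hls_model.compile()   # C simulation; hls_model.build() would run HLS synthesis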
Particle tracking at Large Hadron Collider (LHC) experiments is a crucial component of particle reconstruction, yet it remains one of the most computationally challenging tasks in this process. As we approach the High-Luminosity LHC era, the complexity of tracking is expected to increase significantly. Leveraging coprocessors such as GPUs presents a promising solution to the rising...
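A toy example of why GPUs suit the problem: the combinatorial step of pairing hits between detector layers is embarrassingly parallel. The CuPy sketch below is an invented illustration (hit counts, geometry, and the compatibility cut are made up), not an actual LHC tracking algorithm.

    # Pair hits between two layers in one broadcasted GPU computation.
    import cupy as cp

    hits_layer1 = cp.random.uniform(-1, 1, size=(2000, 3)).astype(cp.float32)
    hits_layer2 = cp.random.uniform(-1, 1, size=(2000, 3)).astype(cp.float32)

    # All 2000 x 2000 pairwise distances at once on the GPU.
    d = cp.linalg.norm(hits_layer1[:, None, :] - hits_layer2[None, :, :], axis=-1)
    seeds = cp.argwhere(d < 0.05)            # loose cut -> candidate track seeds
    print(int(seeds.shape[0]), "candidate hit pairs")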
Core-collapse supernova explosions offer a wealth of physics to explore. The emitted neutrinos are the first signals to reach Earth, and detecting these neutrinos and their direction can provide valuable information to optical detection systems in a multi-messenger astronomy approach. In liquid argon time projection chambers such as DUNE, charged-current interactions are the most abundant...
Decoding neural activity into behaviorally relevant variables such as speech or movement is an essential step in the development of brain-machine interfaces (BMIs) and can be used to clarify the role of distinct brain areas in relation to behavior. Two-photon (2p) calcium imaging provides access to thousands of neurons with single-cell resolution in genetically defined populations and therefore...
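As a minimal illustration of the decoding step, one can fit a linear decoder from activity traces to a behavioral variable; the arrays below are synthetic stand-ins for real 2p recordings, and ridge regression is just one simple choice of decoder.

    # Linear decoding of a behavioral variable from (time x neurons) traces.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    activity = rng.poisson(0.3, size=(3000, 500)).astype(float)   # T x N traces
    behavior = activity @ rng.normal(size=500) + rng.normal(scale=2.0, size=3000)

    # shuffle=False keeps train/test contiguous in time, avoiding leakage.
    X_tr, X_te, y_tr, y_te = train_test_split(activity, behavior, shuffle=False)
    decoder = Ridge(alpha=10.0).fit(X_tr, y_tr)
    print("held-out R^2:", decoder.score(X_te, y_te))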
Field-Programmable Gate Arrays (FPGAs) are becoming increasingly pivotal in the advancement of artificial intelligence (AI) and deep learning applications. Their unique architecture allows for customizable hardware acceleration, which is instrumental in handling the intensive computational demands of modern AI algorithms. Transmission Electron Microscopy (TEM) provides exceptional...
In this work we present advances in follow-up methods for detecting electromagnetic counterparts to gravitational wave signals. These multi-messenger observations are important targets because they can unlock science such as measurement of the Hubble constant, a major current effort in cosmology. We include a data-driven heuristic to select anomalous flares...
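One plausible form such a heuristic could take (an invented illustration, not necessarily the selection used in this work) is a robust outlier cut on each source's light curve:

    # Flag flares whose flux exceeds the baseline by k robust sigmas.
    import numpy as np

    def anomalous_flares(flux, k=5.0):
        """Indices where flux > median + k * scaled MAD."""
        med = np.median(flux)
        mad = 1.4826 * np.median(np.abs(flux - med))  # ~sigma for Gaussian noise
        return np.flatnonzero(flux > med + k * mad)

    rng = np.random.default_rng(2)
    light_curve = rng.normal(1.0, 0.05, size=1000)    # quiescent baseline
    light_curve[[120, 640]] += 1.5                    # injected flares
    print(anomalous_flares(light_curve))              # -> [120 640]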
Accurate estimation of subglacial bed topography is crucial for understanding ice sheet dynamics and their responses to climate change. In this study, we employ machine learning models, enhanced with Spark parallelization, to predict subglacial bed elevation using surface attributes such as ice thickness, flow velocity, and surface elevation. Radar track data serves as ground truth for model...
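A sketch of how such a Spark-parallelized model might be assembled with pyspark.ml; the file path and column names are placeholders standing in for the surface attributes and radar-track ground truth described above.

    # Distributed random forest regression of bed elevation from surface attributes.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.regression import RandomForestRegressor

    spark = SparkSession.builder.appName("bed-topography").getOrCreate()
    df = spark.read.parquet("radar_tracks.parquet")   # placeholder dataset

    assembler = VectorAssembler(
        inputCols=["ice_thickness", "flow_velocity", "surface_elevation"],
        outputCol="features",
    )
    rf = RandomForestRegressor(featuresCol="features", labelCol="bed_elevation",
                               numTrees=100)
    model = rf.fit(assembler.transform(df))           # training parallelized by Spark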
In time-domain astronomy, rapid classification of astronomical transients is critical for determining candidates for follow-up observations. With the advent of the Vera Rubin Observatory’s Legacy Survey of Space and Time, the volume of astronomical data will grow by terabytes each night. Machine learning models capable of processing and analyzing large quantities of data can advance the...
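As a schematic of the general approach (the features, classes, and light curves below are invented for illustration), one might summarize each light curve with a few statistics and train a classifier whose scores rank candidates for follow-up:

    # Feature-based transient classification on synthetic light curves.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def light_curve_features(time, flux):
        """Crude per-object summary: peak flux, variability, time to peak."""
        return [flux.max(), flux.std(), time[np.argmax(flux)] - time[0]]

    rng = np.random.default_rng(3)
    t = np.linspace(0, 30, 60)                        # 30 nights, assumed cadence
    X = np.array([light_curve_features(t, rng.gamma(2.0, 1.0, size=60))
                  for _ in range(1000)])              # stand-in light curves
    y = rng.integers(0, 3, size=1000)                 # e.g. three transient classes

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    priority = clf.predict_proba(X[:5])               # scores to rank follow-up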
Binary black hole mergers can be located by collecting and analyzing the unique gravitational wave signals they produce. Deep learning models, specifically Aframe, can identify and filter gravitational wave signals more accurately and in less time than traditional matched filtering analyses. Aframe was originally developed...
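The sketch below is not Aframe's actual architecture, only a minimal PyTorch illustration of the underlying idea: a small 1D convolutional network that scores segments of multi-detector strain for the presence of a merger signal.

    # Toy strain classifier: (batch, detectors, samples) -> signal logit.
    import torch
    import torch.nn as nn

    class StrainClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(2, 16, kernel_size=64, stride=4),  # 2 interferometers
                nn.ReLU(),
                nn.Conv1d(16, 32, kernel_size=32, stride=4),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
                nn.Flatten(),
                nn.Linear(32, 1),                            # signal-vs-noise logit
            )

        def forward(self, x):
            return self.net(x)

    # Eight one-second segments at an assumed 4096 Hz sample rate.
    scores = StrainClassifier()(torch.randn(8, 2, 4096))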
The next phase of high energy particle physics research at CERN will involve the High-Luminosity Large Hadron Collider (HL-LHC). In preparation for this phase, the ATLAS Trigger and Data AcQuisition (TDAQ) system will undergo upgrades to the online software tracking capabilities. Studies are underway to assess a heterogeneous computing farm deploying GPUs and/or FPGAs, together with the...
Pixel detectors are highly valuable for their precise measurement of charged particle trajectories. However, next-generation detectors will demand even smaller pixel sizes, resulting in extremely high data rates surpassing those at the HL-LHC. This necessitates a “smart” approach for processing incoming data, significantly reducing the data volume for a detector’s trigger system to select...
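A toy model of the kind of on-detector reduction a "smart" pixel front end might perform (the numbers are invented, not a real readout specification): zero-suppress pixels below threshold and ship only the addresses and charges of hit pixels rather than the full frame.

    # Zero-suppression: keep (row, col, charge) for pixels above threshold.
    import numpy as np

    rng = np.random.default_rng(4)
    frame = rng.poisson(0.01, size=(256, 256))    # mostly-empty pixel matrix
    frame[100:103, 40:42] += 30                   # one particle cluster

    rows, cols = np.nonzero(frame > 5)            # threshold cut
    packed = np.stack([rows, cols, frame[rows, cols]], axis=1)

    full_bits = frame.size * 16                   # naive 16-bit-per-pixel readout
    packed_bits = packed.size * 16
    print(f"reduction: {full_bits / packed_bits:.0f}x")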