Particle detectors at accelerators generate large amounts of data, requiring analysis to derive insights. Collisions lead to signal pile-up, in which multiple particles produce signals in the same detector sensors, complicating the identification of individual signals. This contribution describes the implementation of a deep learning algorithm on a Versal ACAP device for improved processing via...
When electrons are produced at the LHCb experiment, they usually have a long way to go before they reach the electromagnetic calorimeter (ECAL). On this journey they traverse many layers of detector material, which causes them to lose energy through Bremsstrahlung emission in the form of photons. When these photons are emitted before the magnet of the detector, the electrons...
In the LHCb experiment, the calorimeters (ECAL and HCAL) are divided into regions with different dimensions and sensor sizes.
The energy is measured by considering the signal in clusters of CALO cells. In this talk, a new clusterisation method for the LHCb ECAL is proposed.
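As a rough illustration of the general idea (not the proposed LHCb method), the following Python sketch performs seed-based clustering on a toy grid of cell energies: a cell above threshold that is a local maximum seeds a cluster, whose energy is the sum over the surrounding 3x3 window. The grid, threshold and window size are illustrative assumptions.

    import numpy as np

    def cluster_energies(cells, seed_threshold=0.5):
        """Seed-based 3x3 clustering on a 2D grid of cell energies (GeV).

        A cell is a seed if it exceeds the threshold and is a local maximum;
        the cluster energy is the sum over the 3x3 window around the seed.
        Threshold and window size are illustrative only.
        """
        ny, nx = cells.shape
        clusters = []
        for iy in range(1, ny - 1):
            for ix in range(1, nx - 1):
                window = cells[iy - 1:iy + 2, ix - 1:ix + 2]
                seed = cells[iy, ix]
                if seed >= seed_threshold and seed == window.max():
                    clusters.append(((iy, ix), window.sum()))
        return clusters

    rng = np.random.default_rng(0)
    grid = rng.exponential(0.05, size=(12, 12))  # toy noise floor
    grid[5, 7] += 3.2                            # inject one energetic deposit
    print(cluster_energies(grid))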
In this work, the electron identification performance and the rate of misidentification of pions as electrons are measured with 2024 LHCb data. The detector and the reconstruction have changed significantly for Run 3, so it is important to validate the electron identification performance with the early data. Electron identification is evaluated by the electron reconstruction algorithms in the...
Imaging Atmospheric Cherenkov Telescopes (IACTs) use combined analog and digital electronics in their trigger systems, implementing simple but fast algorithms. Such trigger techniques are dictated by the extremely high data rates and strict timing requirements. In recent years, in the context of a new camera design for the Large-Sized Telescopes (LSTs) of the Cherenkov Telescope Array (CTA)...
The LHCb experiment has demonstrated its huge potential in the field of heavy-ion collisions. However, PbPb collisions produce a high-occupancy regime that is challenging not only at the hardware level but also at the software level. To keep the high-quality track and PID reconstruction demonstrated in pp collisions, some modifications to the LHCb HLT2 trigger reconstruction are needed, especially regarding...
The study of the power consumption and sustainability of LHC trigger systems is imperative in view of the upcoming high-luminosity era of the LHC collider, which will increase the output data rate beyond several tens of TB/s. In this talk, we will show the work performed at IFIC in the context of the High-Low project, including some of the proposals that can be considered to optimize energy...
This thesis presents a set of optimization efforts within the Allen framework at CERN's LHCb experiment, with a specific focus on increasing throughput and obtaining deterministic behaviour in both CPU and GPU executions. The key areas of development are the algorithms working with events containing luminosity data, and their tests. These algorithms were identified as a bottleneck...
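As background on why cross-backend determinism is non-trivial, here is a toy Python sketch (not Allen code): floating-point addition is not associative, so a reduction whose accumulation order depends on scheduling can differ between runs, while a fixed-shape pairwise reduction makes the order a function of the data alone.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(scale=1e8, size=100_000)

    # Two accumulation orders of the same data can disagree in the last bits.
    s_forward = sum(x)
    s_reverse = sum(x[::-1])
    print(s_forward == s_reverse)  # often False

    # A fixed-shape pairwise (tree) reduction makes the accumulation order
    # depend only on the data length, so any backend (CPU or GPU) that
    # follows the same tree produces bit-identical results.
    def pairwise_sum(a):
        a = np.asarray(a, dtype=np.float64)
        while a.size > 1:
            if a.size % 2:              # pad odd lengths with a neutral 0.0
                a = np.append(a, 0.0)
            a = a[0::2] + a[1::2]       # one tree level per iteration
        return a[0]

    print(pairwise_sum(x))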
The escalating demand for data processing in particle physics research has spurred the exploration of novel technologies to enhance the efficiency and speed of calculations. This study presents the development of a port of MADGRAPH, a widely used tool for particle collision simulations, to FPGAs using High-Level Synthesis (HLS).
Experimental evaluation is ongoing, but preliminary assessments...
The Overlap Muon Track Finder (OMTF) is a key subsystem of the CMS L1 Trigger, and a new version of the OMTF is being developed for the CMS Phase-2 upgrade, targeting the High-Luminosity Large Hadron Collider era. This upgraded version, implemented on a custom ATCA board with a Xilinx UltraScale+ FPGA and 25 Gbps optical transceivers, focuses on improving the muon trigger algorithm and input data...
In view of the upcoming high-luminosity operations of the LHC (HL-LHC), significant upgrades of the CMS trigger system are foreseen to maintain high physics selectivity with finer granularity and more robust readout electronics. The present Drift Tube (DT) on-detector electronics will be replaced by new readout boards which will perform the time digitisation of the signals inside...
The Phase-2 CMS upgrade will replace the trigger and data acquisition system in preparation for the HL-LHC. This upgrade will allow a maximum accept rate of 750 kHz and a latency of 12.5 µs. To achieve this, new electronics and firmware are being designed, with the expectation of significantly improving the physics reach of the current system.
In this talk we describe the first version of an...
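For scale (assuming the nominal 40 MHz LHC bunch-crossing rate, which is not stated in the abstract): a 12.5 µs latency corresponds to buffering 40 MHz × 12.5 µs = 500 bunch crossings in the front-end pipelines, and a 750 kHz accept rate means selecting at most roughly one crossing in 53 (40 MHz / 750 kHz ≈ 53).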
Particle Identification (PID) is crucial for all analyses at LHCb. The PID machinery of this experiment includes both hardware and software resources to distinguish between electrons, kaons, pions, muons and protons. A key part of particle identification is the estimation of the efficiencies of PID selection criteria through data-driven methods. For that, the tool PIDCalib (Particle...
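To make the data-driven idea concrete, here is a minimal Python sketch of the underlying calculation (a toy illustration, not the PIDCalib API): the efficiency of a PID cut is estimated per kinematic bin as N(pass)/N(total) in a calibration sample where the true species is known. The binning, variables and cut value are invented for the example.

    import numpy as np

    # Toy calibration sample with known true species (e.g. from a clean decay).
    rng = np.random.default_rng(2)
    p   = rng.uniform(3, 100, 50_000)         # momentum [GeV]
    eta = rng.uniform(2, 5, 50_000)           # pseudorapidity
    pid = rng.normal(loc=0.02 * p, scale=5)   # toy PID response

    passed = pid > 3.0                        # illustrative PID selection cut
    p_bins, eta_bins = np.linspace(3, 100, 11), np.linspace(2, 5, 7)

    # Per-bin efficiency: N(pass) / N(total), guarding against empty bins.
    n_tot,  _, _ = np.histogram2d(p, eta, bins=(p_bins, eta_bins))
    n_pass, _, _ = np.histogram2d(p[passed], eta[passed], bins=(p_bins, eta_bins))
    eff = np.divide(n_pass, n_tot, out=np.zeros_like(n_tot), where=n_tot > 0)
    print(eff.round(2))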
High-energy physics is at the forefront of the transformation of research into a computing-intensive field. This process, already challenging for large collaborations at the LHC, can strain the resources of smaller teams, who face technical challenges that require a high level of coding ability.
The eScience center [https://www.esciencecenter.nl/] is an organisation funded by the Dutch...
VirtuaLearn3D++: Algorithms from unstructured data spaces. From geography and engineering to high-energy physics.
Finding general solutions to geometric problems has been a complex concern, studied thoroughly since the 19th century. The fundamental contribution of David Hilbert through the Nullstellensatz equipped us with a dictionary between algebra and geometry. Then, any geometry that can...
In this talk, the new "Downstream" algorithm developed at LHCb is reviewed at both the HLT1 and HLT2 trigger levels. At HLT1, the algorithm is able to reconstruct and select very displaced vertices in real time, making use of the Upstream Tracker (UT) and the Scintillator Fiber detector (SciFi) of LHCb, and being executed on GPUs inside the Allen framework. In addition to an optimized strategy, it...
One of the main challenges at LHCb comes from the real-time reconstruction of tracklets in the Scintillator Fiber detector (SciFi), due to the large hit combinatorics in this detector. The new "Faraway" algorithm, which features an innovative strategy for the reconstruction and vertexing of two SciFi tracks, is presented here, together with its present performance and future prospects. The development...
BuSca is a prototype algorithm at LHCb designed for real-time BSM particle searches, focused on downstream reconstructed tracks detected exclusively by the UT and SciFi detectors. By projecting physics candidates onto 2D histograms of flight distance and mass hypotheses at a 30 MHz rate, BuSca identifies hot spots indicative of potential new-particle candidates, thereby providing...
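A toy Python sketch of the hot-spot idea follows (illustrative only: the binning, the flat-background estimate and the injected signal are invented for the example, and none of it is BuSca code): fill a 2D histogram of flight distance versus mass hypothesis and flag bins far above a smooth background expectation.

    import numpy as np

    rng = np.random.default_rng(3)
    fd   = rng.exponential(50, 200_000)        # background flight distance [mm]
    mass = rng.uniform(200, 3000, 200_000)     # background mass hypothesis [MeV]
    # Inject a fake long-lived signal at (fd ~ 300 mm, m ~ 1800 MeV).
    fd   = np.append(fd,   rng.normal(300, 10, 500))
    mass = np.append(mass, rng.normal(1800, 15, 500))

    h, fd_edges, m_edges = np.histogram2d(fd, mass, bins=(60, 60))
    mu = np.median(h)                          # crude flat-background estimate
    signif = (h - mu) / np.sqrt(np.maximum(mu, 1))
    hot = np.argwhere(signif > 5)              # illustrative 5-sigma threshold
    for iy, ix in hot:
        print(f"hot spot: fd ~ {fd_edges[iy]:.0f} mm, mass ~ {m_edges[ix]:.0f} MeV")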
The expected increase in the recorded dataset for future upgrades of the main experiments at the Large Hadron Collider (LHC) at CERN, including the LHCb detector, combined with limited bandwidth, brings computational challenges that classical computing struggles to solve. Emerging technologies such as Quantum Computing (QC), which exploits the principles of superposition and interference,...
Designing the next generation of colliders and detectors involves solving optimization problems in high-dimensional spaces, where the optimal solutions may lie in regions that even a team of expert humans would not explore.
Furthermore, the amount of data we need to generate to study physics for the next runs of large HEP machines, and that we will need for future colliders, is staggering,...
We propose an optimization system for a Parallel-Plate Avalanche Counter with Optical Readout designed for heavy-ion tracking and imaging. Exploiting differentiable programming, we model the reconstruction of the position for different detector configurations and build an optimization cycle that minimizes an objective function. We analyze the performance improvement using this method,...
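A minimal sketch of such an optimization cycle, written with JAX and a stand-in toy model (the actual objective function and the PPAC optical-readout simulation are not reproduced here; all parameters and the model are assumptions for illustration):

    import jax
    import jax.numpy as jnp

    # Differentiable optimization cycle: detector parameters -> differentiable
    # model of position reconstruction -> objective -> gradient step.
    def resolution_proxy(params, hits):
        gain, width = params
        # Toy model: reconstruction error grows with sensor width and shrinks
        # with gain, with a quadratic penalty on gain. Purely illustrative.
        err = width / (1.0 + gain) + 0.01 * gain**2
        return jnp.mean((hits * err) ** 2)

    grad_fn = jax.grad(resolution_proxy)       # gradient w.r.t. the parameters
    params = jnp.array([1.0, 5.0])             # initial (gain, width)
    hits = jnp.ones(100)                       # toy hit weights
    lr = 0.05
    for step in range(200):
        params = params - lr * grad_fn(params, hits)
    print(params)                              # parameters minimizing the toy objective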
In recent years, the incorporation of new hardware architectures at the trigger level has significantly enhanced the potential of LHC experiments. This includes the use of FPGAs and GPUs for real-time fast track reconstruction. In this talk, we will review the key aspects of these advancements, examine current technology trends, and explore the emerging strategies being developed by the...