Description
To improve searches for new particles and measurements of particle properties at CERN, the LHC is undergoing a high-luminosity upgrade that will provide a dataset ten times larger than the one currently available. To cope with the increased number of simultaneous interactions (pileup) per collision, which complicates particle reconstruction, a radiation-hard high-granularity calorimeter (HGCal) will be installed to measure the energy and position of particles with significantly improved precision. This finer granularity, together with the higher pileup, implies a significant increase in complexity and data rates, which must be reduced by several orders of magnitude in real time before the data can be processed. We therefore aim to explore the application of machine learning to optimize the data compression performed by the HGCal front-end electronics, developing a conditional autoencoder that compresses data automatically before transmission.
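To make the conditional-autoencoder idea concrete, below is a minimal sketch in PyTorch. All specifics here are illustrative assumptions, not the actual HGCal front-end design: the 48-channel input, the 16-dimensional latent code, the two conditioning variables (standing in for quantities such as occupancy or sensor geometry), and the layer sizes are placeholders. The key point is that both the encoder and the decoder receive the conditioning variables alongside their main input, so the compression can adapt to the operating conditions.

```python
# Minimal sketch of a conditional autoencoder (illustrative, not the HGCal design).
import torch
import torch.nn as nn

class ConditionalAutoencoder(nn.Module):
    def __init__(self, n_inputs=48, n_latent=16, n_cond=2):
        super().__init__()
        # Encoder: compress the channel readings, conditioned on auxiliary variables.
        self.encoder = nn.Sequential(
            nn.Linear(n_inputs + n_cond, 64),
            nn.ReLU(),
            nn.Linear(64, n_latent),
        )
        # Decoder: reconstruct the input from the latent code and the same conditions.
        self.decoder = nn.Sequential(
            nn.Linear(n_latent + n_cond, 64),
            nn.ReLU(),
            nn.Linear(64, n_inputs),
        )

    def forward(self, x, cond):
        # z is the compressed representation that would be transmitted off-detector.
        z = self.encoder(torch.cat([x, cond], dim=-1))
        # The reconstruction is used only during training to define the loss.
        x_hat = self.decoder(torch.cat([z, cond], dim=-1))
        return x_hat, z

# Toy usage: a batch of 8 sensor modules, each with 48 channels and 2 condition values.
model = ConditionalAutoencoder()
x = torch.rand(8, 48)
cond = torch.rand(8, 2)
x_hat, z = model(x, cond)
loss = nn.functional.mse_loss(x_hat, x)  # reconstruction objective
```

In this sketch only the latent code `z` (16 values instead of 48) would be sent over the link, giving the reduction in data volume; the decoder and the reconstruction loss exist to train the encoder so that the transmitted code preserves the energy and position information.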