Description
Central and mid-peripheral heavy-ion collisions produce a significant underlying event superimposed on the hard scattering of interest. Traditional underlying-event mitigation algorithms developed for heavy-ion collisions impose limitations on their applicability: jets typically have to be clustered in the presence of the underlying event, and observables such as the jet energy have to be corrected for the underlying-event offset afterward. Furthermore, while machine-learning methods can improve the performance of underlying-event mitigation, no such approach has been proposed that can be trained without access to jets from clean hard scatterings, i.e., trained in situ on experimental heavy-ion (e.g., PbPb) data alone, independently of any Monte Carlo or detector modeling. Overcoming this constraint is particularly important for heavy-ion jet measurements, since any reliance on Monte Carlo introduces a theory uncertainty from the modeling of heavy-ion jets. For the first time, an algorithm is presented that removes all of these limitations: (a) it preserves the complete event information, (b) it retains only particles with positive transverse momentum, without the need for negative or counterbalancing particles, (c) it uses machine learning to optimize the signal-to-noise ratio, and (d) it can be trained in situ using heavy-ion data only. In particular, it overcomes the lack of ground-truth clean events by means of self-supervised machine learning. The performance of the algorithm is demonstrated using jet kinematics, substructure, and jet-axis observables, and compared to existing underlying-event mitigation methods. The implications of this algorithm for future and novel measurements are discussed.
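To make the contrast concrete, the sketch below illustrates the conventional area-based workflow the abstract refers to: the jet is clustered with the underlying event still present, a median transverse-momentum density rho is estimated from the event itself, and the jet transverse momentum is corrected afterward as pT_corr = pT_raw - rho * A_jet. The grid granularity and function names here are illustrative assumptions, not part of the abstract.

```python
# Schematic, stand-alone illustration of area-based underlying-event
# subtraction (the baseline approach contrasted in the abstract).
# All parameters (grid size, eta range) are illustrative choices.
import numpy as np

def estimate_rho(particle_pt, particle_eta, particle_phi,
                 eta_max=3.0, n_eta=12, n_phi=12):
    """Median pT density over an eta-phi grid (schematic rho estimator)."""
    eta_edges = np.linspace(-eta_max, eta_max, n_eta + 1)
    phi_edges = np.linspace(-np.pi, np.pi, n_phi + 1)
    # Sum particle pT in each eta-phi patch.
    patch_pt, _, _ = np.histogram2d(particle_eta, particle_phi,
                                    bins=[eta_edges, phi_edges],
                                    weights=particle_pt)
    patch_area = (eta_edges[1] - eta_edges[0]) * (phi_edges[1] - phi_edges[0])
    # The median patch density is robust against the few patches
    # containing the hard jets.
    return np.median(patch_pt) / patch_area

def subtract_jet_pt(raw_jet_pt, jet_area, rho):
    """Offset correction applied to the jet after clustering."""
    return raw_jet_pt - rho * jet_area
```

In real analyses this role is typically played by FastJet's area-based background estimators; the sketch only reproduces the rho * A arithmetic to show that the correction is applied to the jet after clustering, rather than removing background at the particle level.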
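The following PyTorch sketch is a hypothetical illustration of how a per-particle network could be trained in situ on heavy-ion data alone: a measured event is doped with particles drawn from a second, independent event, so the training labels arise from the data itself rather than from Monte Carlo or clean hard scatterings. The architecture, input features, mixing scheme, and loss are assumptions for illustration and are not the abstract's actual self-supervised algorithm.

```python
# Hypothetical in-situ training sketch: labels come from event mixing on
# data only, not from simulation.  This illustrates the general idea of
# learning without ground-truth clean events; it is NOT the abstract's method.
import torch
import torch.nn as nn

class ParticleWeightNet(nn.Module):
    """Predicts a per-particle background probability from (pT, eta, phi)."""
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def doped_batch(event, mixed_event):
    """Concatenate a measured event with particles taken from another event;
    the injected particles are background by construction, which supplies
    the label without any clean hard-scattering sample."""
    x = torch.cat([event, mixed_event], dim=0)
    y = torch.cat([torch.zeros(len(event)), torch.ones(len(mixed_event))])
    return x, y

model = ParticleWeightNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

def train_step(event, mixed_event):
    """One optimization step on a single doped event (tensors of shape (N, 3))."""
    x, is_injected = doped_batch(event, mixed_event)
    optimizer.zero_grad()
    w = model(x)                    # predicted background probability per particle
    loss = loss_fn(w, is_injected)  # labels come from the mixing itself
    loss.backward()
    optimizer.step()
    return loss.item()
```

At analysis time, such per-particle weights could be used to scale or select particles before clustering, which is one way to keep only positive-transverse-momentum particles without introducing negative or counterbalancing ones; again, this mirrors the spirit of the abstract rather than its specific algorithm.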