Speaker
Description
Relativistic pair beams created in the intergalactic medium (IGM) by TeV gamma rays from blazars are expected to produce a detectable GeV-scale electromagnetic cascade, yet this cascade component is absent from the spectra of many hard-spectrum TeV-emitting blazars. One common explanation is that weak intergalactic magnetic fields deflect the electron-positron pairs away from our line of sight. An alternative possibility is that electrostatic beam-plasma instabilities drain the energy of the pairs before a cascade can develop. Recent studies have shown that beam scattering by oblique electrostatic modes leads to only minimal energy loss; however, these modes may be suppressed by linear Landau damping (LLD) on MeV-scale cosmic-ray electrons in the IGM. In this work, we explore the impact of LLD on the energy-loss efficiency of plasma instabilities in pair beams associated with 1ES 0229+200.
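For orientation, the competition described above can be phrased as a comparison of rates: a beam-driven electrostatic mode with wavevector \(\mathbf{k}\) is effectively suppressed when the magnitude of its Landau damping rate exceeds its linear growth rate. A minimal sketch, using the standard (nonrelativistic, one-dimensional along \(\mathbf{k}\)) textbook form of the damping rate, with symbols and normalizations chosen here for illustration rather than taken from the talk:

\[
\gamma_{\rm LLD}(\mathbf{k}) \;=\; \frac{\pi}{2}\,\frac{\omega_{pe}^{3}}{k^{2}}\left.\frac{\partial f}{\partial v}\right|_{v=\omega/k},
\qquad
\text{suppression when}\quad \left|\gamma_{\rm LLD}(\mathbf{k})\right| \gtrsim \gamma_{\rm beam}(\mathbf{k}),
\]

where \(\omega_{pe}\) is the IGM plasma frequency, \(f\) is the background electron velocity distribution (normalized to unity), whose slope at the resonant velocity is set by the MeV-scale cosmic-ray electrons since the thermal IGM electrons are far too slow to resonate, and \(\gamma_{\rm beam}\) is the growth rate of the beam-plasma mode. Oblique modes have phase velocities \(\omega/k \simeq c\cos\theta < c\) and can therefore resonate with, and be damped by, the mildly relativistic cosmic-ray electrons, whereas quasi-parallel modes with \(\omega/k \simeq c\) lie outside the resonant range.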
We find that LLD effectively suppresses the oblique electrostatic modes while quasi-parallel modes continue to grow. In this way, LLD enhances the energy-loss efficiency of the instability by more than an order of magnitude, depending on the distance from the blazar. We plan to follow up with a quantitative analysis of how this increased efficiency influences the observable spectrum of the GeV cascade.
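As a rough way to see why this matters for the cascade (an illustrative definition; the talk may quantify efficiency differently), one can compare the beam's instability-driven energy-loss rate with its inverse-Compton loss rate on the CMB,

\[
\eta \;\equiv\; \frac{\dot E_{\rm inst}}{\dot E_{\rm IC}},
\qquad
\dot E_{\rm IC} \;=\; \tfrac{4}{3}\,\sigma_{\rm T}\,c\,\gamma_e^{2}\,U_{\rm CMB},
\]

where \(\gamma_e \sim 10^{6}\text{--}10^{7}\) is the Lorentz factor of the TeV-induced pairs and \(U_{\rm CMB}\) is the CMB energy density at the relevant redshift. When \(\eta\) approaches or exceeds unity, the beam energy is dissipated in the IGM rather than reprocessed into the GeV cascade, so an order-of-magnitude increase in efficiency can qualitatively change the predicted cascade flux.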