Description
Calorimetric space experiments have been employed for direct measurements of cosmic-ray spectra above the TeV region. According to several theoretical models, relevant features are expected in both the electron and nuclei fluxes. Unfortunately, sizable disagreements exist between the current results of different space calorimeters. In order to improve the accuracy of future experiments, it is fundamental to understand the reasons for these discrepancies, especially since they are not compatible with the quoted experimental errors. A few articles by different collaborations suggest that a systematic error of a few percent, related to the energy-scale calibration, could explain these differences. In this work we analyze the impact of the non-proportionality of the scintillating crystals’ light yield (also known as “quenching”) on the energy scale of a typical calorimeter. Space calorimeters are usually calibrated with minimum ionizing particles (MIPs), e.g. non-interacting protons or helium nuclei, which feature different ionization densities with respect to the particles produced in showers. By using the experimental data obtained by the CaloCube collaboration, together with a simplified model of the light yield as a function of the ionization density, several scintillating crystals (BGO, CsI(Tl), LYSO, YAP, YAG and BaF2) are characterized. The response of each crystal is then implemented inside a Monte Carlo simulation of a typical space calorimeter, which is used to study the energy deposited by electromagnetic and hadronic showers. The results of this work show that the energy scale obtained with a MIP calibration could be affected by a sizable systematic error if the non-proportionality of the scintillation light is not properly taken into account.
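The bias mechanism described in the abstract can be sketched numerically with Birks’ law, a common simplified model of scintillation light yield versus ionization density. All numbers below (the Birks constant, the dE/dx values for a MIP and for shower particles, and the deposited energy) are illustrative assumptions, not CaloCube measurements or properties of any specific crystal:

```python
# Sketch: how quenching biases a MIP-based energy calibration.
# Light-yield model: Birks' law, L/E = 1 / (1 + kB * dE/dx).

def relative_light_yield(dedx, kb):
    """Scintillation light per unit deposited energy, relative to the
    unquenched response, according to Birks' law."""
    return 1.0 / (1.0 + kb * dedx)

KB = 0.01           # illustrative Birks constant [g / (MeV cm^2)]
DEDX_MIP = 2.0      # typical MIP ionization density [MeV cm^2 / g]
DEDX_SHOWER = 10.0  # higher ionization density in a shower (illustrative)

# Light-to-energy calibration constant derived from MIPs
calib = 1.0 / relative_light_yield(DEDX_MIP, KB)

# A shower deposit reconstructed with that MIP calibration
true_energy = 100.0  # GeV, illustrative
reconstructed = true_energy * relative_light_yield(DEDX_SHOWER, KB) * calib

bias_percent = 100.0 * (reconstructed / true_energy - 1.0)
print(f"energy-scale bias: {bias_percent:+.1f}%")
```

Because shower particles are more densely ionizing than a MIP, they are quenched more strongly, so the MIP-derived calibration underestimates the deposited energy; the sketch makes this offset explicit as a percent-level scale bias.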