Abstract

Calorimetric space experiments have been employed for direct measurements of cosmic-ray spectra above the TeV region. According to several theoretical models and recent measurements, relevant features are expected in both the electron and nucleus fluxes. Unfortunately, sizable disagreements exist among the current results of different space calorimeters. In order to improve the accuracy of future experiments, it is fundamental to understand the reasons for these discrepancies, especially since they are not compatible with the quoted experimental errors. A few articles from different collaborations suggest that a systematic error of a few percentage points in the energy-scale calibration could explain these differences. In this work, we analyze the impact of the nonproportionality of the light yield of scintillating crystals on the energy scale of typical calorimeters. Space calorimeters are usually calibrated with minimum ionizing particles (MIPs), e.g., nonshowering protons or helium nuclei, whose ionization density distributions differ from those of the particles produced in showers. Using the experimental data obtained by the CaloCube collaboration and a minimalist model of the light yield as a function of the ionization density, several scintillating crystals (BGO, CsI(Tl), LYSO, YAP, YAG and BaF2) are characterized. Then, the response of a few crystals is implemented inside the Monte Carlo simulation of a space calorimeter to study the energy deposited by electromagnetic and hadronic showers. The results of this work show that the energy scale obtained by MIP calibration could be affected by sizable systematic errors if the nonproportionality of the scintillation light is not properly taken into account.
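
To illustrate the mechanism described above, a commonly used minimalist model of scintillation nonproportionality is a Birks-like saturation of the light yield with ionization density. The sketch below is not taken from the paper: the function name, the Birks constant kB, and the dE/dx values are purely illustrative assumptions, chosen only to show how a difference in ionization density between a MIP and the dense components of a shower could translate into a bias of a MIP-based energy scale.

```python
import numpy as np

def light_yield_per_mev(dedx, kB=0.01):
    """Relative light yield per MeV deposited, Birks-like model:
    dL/dx proportional to (dE/dx) / (1 + kB * dE/dx).
    dedx in MeV cm^2/g, kB in g/(MeV cm^2); values are illustrative."""
    return 1.0 / (1.0 + kB * dedx)

# A MIP deposits energy at low ionization density, whereas shower particles
# (e.g., low-energy electrons or nuclear fragments) can reach much higher
# local dE/dx, so the same deposited energy yields less light and a
# MIP-calibrated energy scale is biased (hypothetical numbers below).
dedx_mip = 2.0      # MeV cm^2/g, typical MIP (illustrative)
dedx_shower = 20.0  # MeV cm^2/g, dense shower component (illustrative)

scale_shift = 1.0 - light_yield_per_mev(dedx_shower) / light_yield_per_mev(dedx_mip)
print(f"Relative energy-scale shift: {scale_shift:.1%}")
```

With these illustrative numbers the light produced per unit deposited energy in the dense shower component is noticeably lower than for a MIP, which is the kind of effect the paper quantifies with measured crystal responses and full shower simulations.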
