The gain degradation of an Er‐doped fiber amplifier (EDFA) is characterized while its active Er‐doped fiber is exposed to 40 keV X‐rays (≈2.7 mrad(SiO2) s⁻¹, up to a total dose of 300 krad) at three temperatures: −40, 25, and 120 °C. The spectral dependence of the fiber's radiation‐induced attenuation (RIA) is monitored in situ over the 900–1600 nm range, revealing a combined effect of temperature and radiation on the near‐infrared RIA levels and kinetics. At the system level, the kinetics of EDFA gain degradation are only slightly affected by the irradiation temperature for the tested backward‐pumped EDFA configuration. On the theoretical side, the radiation response of the EDFA is modeled with a homemade computer code that combines particle swarm optimization with the amplifier rate equations, considering only the impact of RIA on the gain. Simulated and experimental results agree well below 70 krad, the dose range relevant to current space missions; at higher doses, however, the comparison between modeled and measured gain‐degradation kinetics shows that a more precise description of how temperature affects the absorption properties of the erbium ions is needed to reproduce the EDFA radiation behavior for future space missions.
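The modeling approach mentioned above, particle swarm optimization (PSO) coupled with the amplifier rate equations, can be illustrated with a minimal sketch. The Python snippet below is not the authors' code: it assumes a strongly simplified gain expression in which RIA grows as a power law of dose and subtracts linearly (in dB) from the unirradiated gain, and it fits the hypothetical parameters (G0_dB, a, f) to synthetic gain-versus-dose data with a basic PSO loop. All names, bounds, and numerical values are illustrative assumptions, not values from the study.

```python
# Illustrative sketch only (not the authors' code): PSO retrieval of a
# power-law RIA model from gain-vs-dose data, assuming the simplified form
#   G_dB(D) = G0_dB - RIA_dB_per_m(D) * L,   RIA_dB_per_m(D) = a * D**f
# where (G0_dB, a, f) are the parameters to retrieve and L is the fiber length.

import numpy as np

rng = np.random.default_rng(0)

L = 5.0                                      # Er-doped fiber length in m (assumed)
doses = np.linspace(0.0, 300.0, 31)          # total dose in krad

def gain_db(dose, params):
    """Simplified EDFA gain vs dose: unirradiated gain minus the RIA
    contribution, with RIA growing as a power law of dose (dB/m)."""
    g0_db, a, f = params
    return g0_db - a * dose**f * L

# Synthetic "measured" data standing in for the irradiation experiment.
true_params = (30.0, 0.02, 0.7)
measured = gain_db(doses, true_params) + rng.normal(0.0, 0.1, doses.size)

def cost(params):
    """Sum of squared residuals between model and measurement."""
    return np.sum((gain_db(doses, params) - measured) ** 2)

# --- basic particle swarm optimization ---
n_particles, n_iter = 40, 200
lo = np.array([20.0, 0.0, 0.1])              # search bounds (assumed)
hi = np.array([40.0, 0.1, 1.5])

pos = rng.uniform(lo, hi, (n_particles, 3))  # particle positions in parameter space
vel = np.zeros_like(pos)
pbest = pos.copy()                           # each particle's best-known position
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()  # swarm's best-known position

w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social weights
for _ in range(n_iter):
    r1 = rng.random((n_particles, 3))
    r2 = rng.random((n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)         # keep particles inside the bounds
    c = np.array([cost(p) for p in pos])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("retrieved (G0_dB, a, f):", np.round(gbest, 3))
```

In the actual study, the rate equations would presumably be solved along the Er‐doped fiber with the measured, wavelength‐dependent RIA injected as an additional loss term; the sketch only conveys how PSO can retrieve model parameters from gain‐degradation measurements.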