Abstract

Research on optical amplifiers has highlighted how ionizing radiation degrades the performance of erbium-doped fiber amplifiers (EDFAs), primarily through a reduction of their gain. The amplitude and kinetics of this degradation are mainly explained by the radiation-induced attenuation (RIA) phenomenon at the pump and signal wavelengths. In this work, the gain degradation of a radiation-tolerant EDFA (exploiting a cerium-co-doped active optical fiber) induced by ionizing radiation up to 3 kGy (SiO2), at two dose rates of 0.28 Gy/s and 0.093 Gy/s, is studied through a combined experimental and simulation approach. Using a home-made simulation code based on the rate and power-propagation equations and including RIA effects, the radiation-dependent performance of the EDFA was estimated. The radiation-induced variations in the spectroscopic parameters were also characterized; our results show that they give rise to an EDFA gain degradation of only about 1%. To avoid overestimating the RIA when irradiating the active rare-earth-doped fiber alone, a new RIA experimental setup is introduced that better accounts for the photobleaching mechanisms associated with 980 nm pumping. Good agreement between the experimental and simulated dose dependences of the gain degradation was obtained for two different irradiation conditions, which also validates the simulation code for harsh-environment applications.
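As a hedged illustration (not the authors' actual code), the sketch below shows how such a simulation can be structured: a simplified two-level Er3+ rate equation gives the local population inversion, and the pump (980 nm) and signal (1550 nm) power-propagation equations include RIA as an additional, dose-dependent loss term. All fiber, spectroscopic, and RIA parameter values are illustrative placeholders, and amplified spontaneous emission and backward propagation are neglected.

```python
# Minimal sketch (assumption, not the paper's code): a two-level EDFA model in
# which RIA enters the power-propagation equations as extra loss at the pump
# and signal wavelengths. Parameter values are purely illustrative.
import numpy as np

h = 6.626e-34          # Planck constant (J.s)
c = 2.998e8            # speed of light (m/s)

# --- illustrative fiber/spectroscopic parameters (hypothetical values) ---
L      = 5.0           # active fiber length (m)
Nt     = 1.0e25        # Er3+ concentration (ions/m^3)
A      = 1.3e-11       # doped-core area (m^2)
tau    = 10e-3         # metastable-level lifetime (s)
Gp, Gs = 0.75, 0.65    # pump/signal overlap factors
sap, sep = 2.5e-25, 0.0        # 980 nm absorption/emission cross sections (m^2)
sas, ses = 2.6e-25, 3.4e-25    # 1550 nm absorption/emission cross sections (m^2)
lam_p, lam_s = 980e-9, 1550e-9

def ria_loss(dose_gy, coeff_db_per_m_per_kgy):
    """Illustrative linear RIA law: excess loss (1/m) versus dose.
    The actual dose/dose-rate dependence must come from measurements."""
    return coeff_db_per_m_per_kgy * (dose_gy / 1e3) * np.log(10.0) / 10.0

def edfa_gain(Pp_in, Ps_in, dose_gy, ria_p_coeff=0.5, ria_s_coeff=0.2, nz=2000):
    """Integrate pump/signal powers along z (forward Euler) and return gain (dB)."""
    dz = L / nz
    alpha_p = ria_loss(dose_gy, ria_p_coeff)   # RIA at 980 nm (1/m)
    alpha_s = ria_loss(dose_gy, ria_s_coeff)   # RIA at 1550 nm (1/m)
    Pp, Ps = Pp_in, Ps_in
    for _ in range(nz):
        # steady-state upper-level fraction from the two-level rate equation
        Wp_a = sap * Gp * Pp / (h * c / lam_p) / A
        Wp_e = sep * Gp * Pp / (h * c / lam_p) / A
        Ws_a = sas * Gs * Ps / (h * c / lam_s) / A
        Ws_e = ses * Gs * Ps / (h * c / lam_s) / A
        n2 = (Wp_a + Ws_a) / (Wp_a + Wp_e + Ws_a + Ws_e + 1.0 / tau)
        # power propagation: Er3+ gain/absorption plus the RIA loss term
        Pp += dz * (Gp * Nt * (sep * n2 - sap * (1 - n2)) * Pp - alpha_p * Pp)
        Ps += dz * (Gs * Nt * (ses * n2 - sas * (1 - n2)) * Ps - alpha_s * Ps)
        Pp = max(Pp, 0.0)
    return 10.0 * np.log10(Ps / Ps_in)

if __name__ == "__main__":
    for dose in (0.0, 1e3, 3e3):   # pristine, 1 kGy, 3 kGy
        g = edfa_gain(Pp_in=100e-3, Ps_in=0.1e-3, dose_gy=dose)
        print(f"dose = {dose / 1e3:4.1f} kGy -> gain = {g:5.2f} dB")
```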
