Abstract

Exposing silica optical fibers to radiative environments leads to an increase of fiber attenuation. This gamma sensitivity of the fibers is strongly wavelength dependent. Many papers have already reported the strong radiation-induced attenuation (RIA) in the UV and visible ranges, which is explained by radiation-induced defects absorbing in these spectral ranges. However, the origin of RIA at longer wavelengths (lambda > 1000 nm) is less clear. An exception is phosphorus-doped fibers, for which P1 defects absorbing around 1700 nm have already been highlighted. For phosphorus-free fibers, the RIA at near-infrared (NIR) wavelengths is usually assumed to be small, as it results from the tail of the UV-visible absorption, which decreases with increasing wavelength. In this paper, we study three prototype silica-based optical fibers and show that the RIA does not decrease monotonically with increasing wavelength, highlighting RIA contributions originating at NIR wavelengths. We show that these NIR-absorbing defects are generally the main contributor to RIA at telecommunication wavelengths (1310 nm and 1550 nm), the UV-visible absorption tail making only a secondary contribution. The nature of the defects involved in these NIR absorptions depends on fiber composition. For phosphorus-free fibers, we propose self-trapped hole (STH) defects as the origin.
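
The following minimal Python sketch illustrates the decomposition described above: a UV-visible absorption tail that decays toward the infrared, plus an absorption band centered in the NIR, so that the NIR band can dominate RIA at 1310 nm and 1550 nm. The band position, width, and amplitudes are hypothetical placeholders for illustration only, not values taken from the paper.

    # Illustrative RIA decomposition: UV-visible tail + NIR-centered band.
    # All numerical parameters below are placeholders, not measured values.
    import numpy as np

    def uv_vis_tail(wavelength_nm, amplitude_db_km=50.0, decay_nm=300.0):
        """Exponential tail of UV-visible color-center absorption (dB/km)."""
        return amplitude_db_km * np.exp(-wavelength_nm / decay_nm)

    def nir_band(wavelength_nm, center_nm=1400.0, width_nm=200.0, amplitude_db_km=5.0):
        """Gaussian absorption band centered in the NIR (dB/km)."""
        return amplitude_db_km * np.exp(-0.5 * ((wavelength_nm - center_nm) / width_nm) ** 2)

    for wl in (1310.0, 1550.0):
        tail = uv_vis_tail(wl)
        band = nir_band(wl)
        print(f"{wl:.0f} nm: UV-vis tail {tail:.2f} dB/km, "
              f"NIR band {band:.2f} dB/km, total {tail + band:.2f} dB/km")

With these placeholder parameters, the NIR band contributes several dB/km at both telecommunication wavelengths while the UV-visible tail has decayed below 1 dB/km, reproducing the qualitative behavior reported in the abstract.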
