Abstract

Raman Distributed Temperature Sensors (RDTSs) offer exceptional advantages for monitoring Cigeo, the planned French deep geological repository for nuclear waste. Both the $\gamma$-rays and the hydrogen released by the nuclear waste can strongly affect RDTS temperature measurements. We present experimental studies of how RDTS performance evolves in harsh environments involving $\gamma$-rays alone or combined with ${\rm H}_2$ release. The responses of two standard and one radiation-tolerant multimode fibers (MMFs) are investigated. In all fibers, the differential induced attenuation between the anti-Stokes and Stokes signals, $({\alpha_{\rm AS}} - {\alpha_{\rm S}})$, causes temperature errors of up to $30^\circ{\rm C}$ in standard MMFs (100 m) irradiated to a 10 MGy dose. This degradation mechanism is more detrimental than the radiation-induced attenuation (RIA), which only limits the sensing range. To understand the origin of the differential RIA, the attenuation in the 800-1600 nm spectral range at room temperature is explored for the three fibers after $\gamma$-irradiation and/or hydrogen loading. We show that adapting the characteristics of the sensing fiber can limit its degradation, but that additional hardening-by-system procedures are necessary to correct the temperature error in view of integrating our RDTS technology in Cigeo. The current version of our correction technique limits the temperature error to $\sim 2^\circ{\rm C}$ for samples irradiated to 10 MGy.
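To make the link between differential RIA and temperature error concrete, the sketch below inverts the standard single-ended RDTS anti-Stokes/Stokes ratio relation, $R(T) \propto \exp(-h\,\Delta\nu / k_B T)\,\exp(-(\alpha_{\rm AS}-\alpha_{\rm S})\,z)$. The 10 dB/km differential RIA value and the helper name `temperature_error` are illustrative assumptions, not quantities reported in the paper; only the $\sim 30^\circ{\rm C}$ order of magnitude at 100 m is taken from the abstract.

```python
import math

# Physical constants
H = 6.626e-34      # Planck constant, J*s
K_B = 1.381e-23    # Boltzmann constant, J/K
C_LIGHT = 3.0e8    # speed of light, m/s

# Raman shift of silica, ~440 cm^-1, expressed as a frequency in Hz
DELTA_NU = 440e2 * C_LIGHT

def temperature_error(t_true_k, delta_alpha_db_per_km, z_km):
    """Apparent temperature error caused by a differential RIA (alpha_AS - alpha_S).

    The measured anti-Stokes/Stokes ratio is scaled by exp(-delta_alpha * z),
    which a single-ended RDTS misreads as a temperature change:
        1/T_meas = 1/T_true + (k_B / (h * delta_nu)) * delta_alpha * z
    """
    delta_alpha = delta_alpha_db_per_km * math.log(10) / 10.0  # dB/km -> 1/km
    inv_t_meas = 1.0 / t_true_k + (K_B / (H * DELTA_NU)) * delta_alpha * z_km
    return 1.0 / inv_t_meas - t_true_k

# 100 m of fiber with an assumed 10 dB/km differential RIA, at room temperature:
# gives roughly -30 K, the same order as the error quoted in the abstract.
print(f"{temperature_error(300.0, 10.0, 0.1):.1f} K")
```

This also shows why the differential attenuation is more damaging than the total RIA: the error term grows linearly with the integrated $(\alpha_{\rm AS}-\alpha_{\rm S})\,z$ along the fiber, so it biases the recovered temperature rather than merely weakening the backscattered signals.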
