Abstract

Several methods exist to measure the group delay of a fiber Bragg grating. Here, we compare two such methods, namely the Hilbert transform (HT) of the device transmission spectrum and standard Fourier spectral interferometry. Numerical simulations demonstrate that both methods work not only for ideal, lossless devices but also for devices with realistic absorption. Experimental measurements show that the HT method is more straightforward to implement and is significantly less susceptible to phase noise, which can reduce the standard deviation between repeated measurements.
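
As an illustration of the HT approach described above, the sketch below reconstructs a minimum-phase estimate of the spectral phase from a transmission magnitude and differentiates it to obtain the group delay. This is a minimal sketch in Python (NumPy/SciPy), assuming the device response is minimum phase so that the log-magnitude and phase form a Hilbert-transform pair; the frequency grid, the toy notch spectrum, and all variable names are hypothetical and not taken from the paper.

    import numpy as np
    from scipy.signal import hilbert

    # Synthetic transmission spectrum with a notch, a toy stand-in for an
    # FBG transmission dip; grid and shape are illustrative only.
    omega = np.linspace(-50.0, 50.0, 4001)               # angular frequency (rad/s)
    T_mag = 1.0 - 0.9 * np.exp(-0.5 * (omega / 5.0)**2)  # |T(omega)|, dimensionless

    # Minimum-phase reconstruction: for a minimum-phase response, ln|T| and
    # the spectral phase are related by a Hilbert transform.
    # scipy.signal.hilbert returns the analytic signal; its imaginary part
    # is the Hilbert transform of the input.
    log_mag = np.log(np.clip(T_mag, 1e-12, None))        # floor avoids log(0)
    phase = -np.imag(hilbert(log_mag))                   # sign depends on FT convention

    # Group delay is the negative frequency derivative of the phase.
    tau_g = -np.gradient(phase, omega)

In this sketch, tau_g would then be compared against the group delay recovered interferometrically; in practice the differentiation step amplifies noise, which is one reason smoothing is often applied before taking the gradient.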
