Abstract

We theoretically and experimentally evaluate the influence of the bandwidth limitation and group delay ripple (GDR) of linearly chirped fiber Bragg gratings (LCFBGs) on all-optical clock recovery based on the temporal Talbot effect. To model the essentially arbitrary GDR profiles of real LCFBGs, we propose and apply a generalized distribution-function model of the GDR. The quality of the recovered clock pulses is evaluated with two proposed metrics, "peak variation" and "pulse visibility." Simulations indicate that recovering clock pulses from a 2^7 - 1 pseudorandom bit sequence (PRBS) with both < ~20% peak variation and > ~17 dB pulse visibility requires signal pulses with a duty factor below 1% together with an LCFBG bandwidth exceeding roughly 125 times the bit rate (10 nm for a 10 Gbit/s input signal). The simulations further indicate that LCFBGs with < ~20 ps peak-to-peak GDR recover clock pulses from the 10 Gbit/s signal with almost the same quality as an ideal, ripple-free LCFBG, and that such GDR does not degrade the timing-jitter reduction effect. Qualitative agreement between the experimental and simulation results validates our analytical methods. Our analytical approach and results should prove very useful for the practical design of LCFBGs for all-optical clock-recovery circuits based on the temporal Talbot effect.
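The temporal Talbot mechanism behind this kind of clock recovery can be sketched numerically. The following Python example is illustrative only: the grid size, the 3 ps pulse width (wider than the paper's < 1% duty factor, chosen to keep the grid modest), and the fractional Talbot condition phi2 = 2*N*T^2/(2*pi) for a pattern of length N are our assumptions, not parameters taken from the paper. It propagates one PRBS-7 period of a 10 Gbit/s return-to-zero pulse train through an ideal, ripple-free quadratic spectral phase and checks that the dispersed pattern interferes into a pulse in every bit slot:

```python
import numpy as np

# --- Simulation grid: 10 Gbit/s RZ signal, one PRBS-7 period per window ---
bit_rate = 10e9
T = 1.0 / bit_rate              # bit slot, 100 ps
N = 127                         # PRBS-7 pattern length
spb = 512                       # samples per bit slot (assumed grid)
dt = T / spb
n = N * spb
t = np.arange(n) * dt

# --- PRBS-7 pattern (x^7 + x^6 + 1, the standard ITU-T O.150 polynomial) ---
state, bits = 0x7F, []
for _ in range(N):
    bits.append(state & 1)
    fb = ((state >> 6) ^ (state >> 5)) & 1
    state = ((state << 1) | fb) & 0x7F
bits = np.array(bits)

# --- RZ pulse train: Gaussian pulses of 3 ps FWHM at the 'one' slots ---
sigma = 3e-12 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
E = np.zeros(n)
for k in np.nonzero(bits)[0]:
    E += np.exp(-0.5 * ((t - (k + 0.5) * T) / sigma) ** 2)

def disperse(field, phi2):
    """Apply an ideal (ripple-free) quadratic spectral phase, i.e. a
    perfectly linear group-delay slope phi2 = beta2 * L."""
    w = 2.0 * np.pi * np.fft.fftfreq(field.size, dt)
    return np.fft.ifft(np.fft.fft(field) * np.exp(0.5j * phi2 * w ** 2))

# Integer Talbot condition on the full PRBS period: the pattern self-images,
# which validates the propagation model.
E_self = disperse(E, 2.0 * (N * T) ** 2 / (2.0 * np.pi))

# Fractional Talbot condition phi2 = 2*N*T^2/(2*pi): every output bit slot
# receives a unit-magnitude-weighted superposition of all input slots, so the
# random pattern averages into a pulse train at the bit rate (the clock).
E_clk = disperse(E, 2.0 * N * T ** 2 / (2.0 * np.pi))

slot_peaks = (np.abs(E_clk).reshape(N, spb).max(axis=1)) ** 2
peak_var = (slot_peaks.max() - slot_peaks.min()) / slot_peaks.mean()
occupied = np.count_nonzero(slot_peaks > 0.05 * slot_peaks.mean())
print(f"slots occupied after dispersion: {occupied}/{N}")
print(f"slot-to-slot peak variation: {peak_var:.2f}")
```

Replacing the ideal quadratic phase inside `disperse` with one that includes a deviation from the linear group-delay slope (for example a sinusoidal ripple) would emulate the GDR whose tolerable magnitude the paper quantifies.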
