Abstract

Recently, it has been shown that the intensity time-bandwidth product of optical signals can be engineered to match that of the data acquisition instrument. In particular, it is possible to slow down an ultrafast signal, compressing its RF bandwidth (a benefit also offered by the Time-Stretch Dispersive Fourier Transform), but with a reduced temporal record length, leading to time-bandwidth compression. The compression is implemented using a warped group delay dispersion that stretches the signal's intensity envelope non-uniformly in time. When the information of interest resides in the complex field, decoding requires optical phase retrieval and reconstruction of the input temporal profile. In this paper, we present results on the general behavior of the reconstruction process and its dependence on the signal-to-noise ratio. We also discuss the role of chirp in the input signal.
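To make the warped group-delay dispersion concrete, the Python/NumPy sketch below applies an arctan-shaped group delay to a femtosecond Gaussian pulse and compares the RMS durations of the input and output intensity envelopes. The arctan kernel is a form that appears in the warped-stretch literature, but the specific kernel and all parameter values here are illustrative assumptions, not the authors' design.

```python
import numpy as np

# Illustrative sketch (not the authors' implementation) of warped group-delay
# dispersion. All parameter values are hypothetical, chosen only so the
# example runs on a sensible grid.

N = 2**14
T = 200e-12                                   # 200 ps time record
dt = T / N
t = (np.arange(N) - N // 2) * dt              # time grid centered at zero
w = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, dt))  # monotonic ang. freq.

E_in = np.exp(-(t / 200e-15) ** 2)            # 200 fs Gaussian input field

# Warped (non-uniform) group delay: tau(w) = A * arctan(w / w0). Near the
# spectral center the delay varies quickly (strong stretching); toward the
# band edges it saturates, so the envelope is stretched non-uniformly.
A, w0 = 40e-12, 2 * np.pi * 0.5e12
tau = A * np.arctan(w / w0)

# Spectral phase is the integral of the group delay over frequency
# (trapezoidal rule; the arbitrary constant is a global phase).
phi = np.concatenate(([0.0], np.cumsum(0.5 * (tau[1:] + tau[:-1]) * np.diff(w))))

spec = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(E_in)))
E_out = np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(spec * np.exp(-1j * phi))))

def rms_width(E):
    """RMS duration of the intensity envelope |E|^2 (pulse centered at t=0)."""
    I = np.abs(E) ** 2
    return np.sqrt(np.average(t ** 2, weights=I))

print(f"input  RMS duration: {rms_width(E_in):.3e} s")   # ~1e-13 s
print(f"output RMS duration: {rms_width(E_out):.3e} s")  # tens of ps: slowed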
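Because the group delay saturates at the band edges, the record length grows less than it would under an equivalent uniform (linear group-delay) dispersion, which is the origin of the time-bandwidth compression; since only the stretched intensity is measured, recovering the input field then requires the phase-retrieval step discussed in the paper.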
