Abstract

Ultra-wide-band analog-to-digital (A/D) conversion is one of the most critical problems faced in communication, instrumentation, and radar systems. This paper presents a comprehensive analysis of the recently proposed time-stretched A/D converter. By reducing the signal bandwidth prior to digitization, this technique offers revolutionary enhancements in the performance of electronic converters. The paper starts with a fundamental-physics analysis of the time-wavelength transformation and the implications of time dilation for the signal-to-noise ratio. A detailed mathematical description of the time-stretch process is then constructed; it elucidates the influence of linear and nonlinear optical dispersion on the fidelity of the electrical signal. Design issues of a single-sideband time-stretch system, as they relate to broad-band operation, are examined. Problems arising from the nonuniform optical power spectral density are explained, and two methods for overcoming them are described. As proof of the concept, 120-GSa/s real-time digitization of a 20-GHz signal is demonstrated. Finally, design issues and performance features of a continuous-time time-stretch system are discussed.
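The core idea summarized above — stretching a signal in time divides all of its frequencies by the stretch factor M, so a back-end digitizer of rate f_s captures the original waveform at an effective rate of M·f_s — can be illustrated with a minimal numerical sketch. The stretch factor and back-end sample rate below are assumptions chosen so that the effective rate matches the 120-GSa/s, 20-GHz demonstration mentioned in the abstract; the sketch models only the ideal time-dilation mapping, not the optical dispersion or spectral-nonuniformity effects the paper analyzes.

```python
import numpy as np

M = 4.0        # assumed stretch factor (not stated in the abstract)
f_sig = 20e9   # 20-GHz input tone, as in the abstract
fs_adc = 30e9  # hypothetical physical back-end digitizer rate

# Time dilation by M divides all signal frequencies by M, so the
# back-end ADC effectively samples the original waveform at M * fs_adc.
fs_eff = M * fs_adc  # 120 GSa/s effective

# Digitize the stretched signal: the 20-GHz tone appears at 5 GHz
# at the back-end ADC, well within its Nyquist band.
n = np.arange(1024)
t_adc = n / fs_adc
stretched_samples = np.cos(2 * np.pi * (f_sig / M) * t_adc)

# Map the samples back to the original (un-stretched) time axis;
# they coincide with direct samples of the input taken at fs_eff.
t_orig = t_adc / M  # equivalently n / fs_eff
direct_samples = np.cos(2 * np.pi * f_sig * t_orig)

assert np.allclose(stretched_samples, direct_samples)
```

The assertion holds because cos(2π(f/M)·n/f_s) = cos(2πf·n/(M·f_s)): sampling the stretched signal at f_s is mathematically identical to sampling the original signal at M·f_s, which is the bandwidth-reduction principle the paper builds on.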
