Abstract

In modern satellite systems, the uplink radio frequency (RF) wideband signal, after downconversion to an intermediate frequency (IF), is fed to an analog-to-digital converter (ADC), whose output is digitally processed for channelization and switching of signals. In such systems, a major implementation limitation is the performance of the ADC. The ADC output contains distortion arising from several sources, such as quantization noise, clipping, and jitter in the sampling clock. The quantization and clipping effects have been analyzed in detail in previously published papers by the author and others. This paper presents a detailed analysis of the impact of sampling clock phase jitter on the signal-to-noise power ratio at the ADC output. The results are derived in terms of the power spectral density of the jitter-induced noise and the noise variance for the general case, and are then specialized to baseband signals, broadband bandpass signals, and narrowband signals. The paper also presents a novel approach for simulating clock jitter using an interpolation technique and the complex baseband representation of bandpass signals. A comparison of the results obtained from the analysis and the simulations shows close agreement between the two.
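
To illustrate the kind of jitter-versus-SNR comparison the abstract describes, the following is a minimal sketch, not the paper's code. It assumes a zero-mean Gaussian timing-jitter model, a single complex-baseband test tone, and the classical small-jitter approximation SNR ≈ −20·log10(2π·f0·σt) for a tone at frequency f0 with RMS jitter σt. Evaluating the analytic signal at the perturbed sampling instants stands in for the interpolation-based resampling step mentioned in the abstract; the sample rate, tone frequency, and jitter value below are illustrative assumptions.

```python
import numpy as np

# Sketch: model sampling-clock jitter by evaluating an ideal waveform at
# perturbed instants t_n = n*Ts + dt_n, where dt_n is zero-mean Gaussian
# timing jitter, then compare the measured SNR to the small-jitter formula.

fs = 100e6          # sample rate in Hz (assumed)
f0 = 10e6           # test-tone frequency in Hz (assumed)
sigma_t = 2e-12     # RMS clock jitter in seconds (assumed)
N = 1 << 16         # number of samples

rng = np.random.default_rng(0)
n = np.arange(N)
Ts = 1.0 / fs

# Ideal and jittered sampling instants
t_ideal = n * Ts
t_jitter = t_ideal + rng.normal(0.0, sigma_t, N)

# Complex-baseband test tone sampled at ideal and jittered instants.
# Evaluating the tone at the perturbed times gives the value a jittered
# clock would capture (standing in for an explicit interpolation step).
x_ideal = np.exp(2j * np.pi * f0 * t_ideal)
x_jitter = np.exp(2j * np.pi * f0 * t_jitter)

# Jitter-induced error and measured signal-to-noise ratio
err = x_jitter - x_ideal
snr_meas = 10 * np.log10(np.mean(np.abs(x_ideal) ** 2)
                         / np.mean(np.abs(err) ** 2))

# Theoretical small-jitter approximation for a tone at f0
snr_theory = -20 * np.log10(2 * np.pi * f0 * sigma_t)

print(f"measured SNR   : {snr_meas:6.2f} dB")
print(f"theoretical SNR: {snr_theory:6.2f} dB")
```

With the assumed values (f0 = 10 MHz, σt = 2 ps), both numbers come out near 78 dB, illustrating the close agreement between analysis and simulation that the abstract reports for its own, more general signal models.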
