Abstract

Radio frequency (RF) receivers are common in many modern communications and radar systems, and their performance is degraded by a variety of hardware limitations. Among these, phase noise and time jitter are particularly troublesome because they introduce random errors that are difficult to compensate. The local oscillator in the receiver front end is a major contributor of phase noise, while the analog-to-digital converter (ADC) introduces time jitter. It is therefore desirable to characterize the accumulated effect of the individual phase noise sources and the time jitter. The combined effect of all phase noise and jitter can be represented by an accumulated phase noise term at the ADC output, called the total phase noise (TPN) in this brief. This work focuses on measuring and modeling the TPN of an RF receiver by applying optimization techniques. In contrast to traditional phase noise measurement, which typically requires a high-quality tunable downconverter, a digital approach that uses data captured directly by the RF receiver is proposed. In addition, iterative optimization-based TPN spectral model fitting and statistical modeling are introduced. The model is examined against the measured TPN, confirming that the RF receiver TPN can be viewed as a wide-sense stationary, zero-mean Gaussian process with a particular spectral profile.
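The optimization-based spectral model fitting mentioned above can be sketched in a few lines. The sketch below is illustrative only: the two-term PSD model S(f) = a/f² + b (an oscillator-like 1/f² slope plus a white jitter floor), the parameter names, and all numerical values are assumptions for demonstration, not the spectral profile or procedure used in the brief.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 1e6   # sample rate in Hz (illustrative)
n = 2**18  # record length

# Assumed two-term TPN PSD model: S(f) = a / f**2 + b
# (oscillator-like 1/f^2 slope plus a white jitter floor; not the paper's model)
a_true, b_true = 1e4, 1e-7

# Synthesize a zero-mean Gaussian process with this one-sided PSD by
# shaping white complex-Gaussian spectral coefficients.
freqs = np.fft.rfftfreq(n, 1.0 / fs)
amp = np.zeros_like(freqs)
resolved = freqs >= 200.0  # zero out DC and unresolved low frequencies
amp[resolved] = np.sqrt((a_true / freqs[resolved] ** 2 + b_true) * fs * n / 2)
spec = amp * (rng.standard_normal(freqs.size)
              + 1j * rng.standard_normal(freqs.size)) / np.sqrt(2)
tpn = np.fft.irfft(spec, n)

# Estimate the PSD, then fit the model by iterative nonlinear least squares
# in the log domain so both the 1/f^2 region and the floor carry weight.
f, psd = welch(tpn, fs=fs, nperseg=4096)
f, psd = f[1:], psd[1:]  # drop the DC bin

def residuals(p):
    a, b = p
    return np.log10(a / f**2 + b) - np.log10(psd)

x0 = [psd[0] * f[0] ** 2, psd[-1]]  # data-driven initial guess
fit = least_squares(residuals, x0, bounds=(1e-15, np.inf))
a_hat, b_hat = fit.x
print(f"a: true {a_true:.3g}, fitted {a_hat:.3g}")
print(f"b: true {b_true:.3g}, fitted {b_hat:.3g}")
```

Checking the wide-sense stationary zero-mean Gaussian assumption would proceed similarly in spirit, e.g. by examining the sample mean and the empirical distribution of `tpn` over segments of the record.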

