Abstract

The effects of sampling jitter on the measurement of the amplitude and phase of a sinusoidal signal have been investigated. The parameters are determined by the fast Fourier transform (FFT) algorithm, which processes the samples after data acquisition. Results are expressed in terms of the mean values and variances of the measured parameters. Computer simulation results, based on Gaussian jitter, show how the amplitude and phase standard deviations vary with the jitter standard deviation and the signal amplitude for different numbers of samples, N. It is shown that the amplitude and phase standard deviations are inversely proportional to √N and proportional to σ_R, the standard deviation of the phase noise (jitter). This dependence on σ_R implies proportionality to both the signal frequency and the time-jitter standard deviation.
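The simulation described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes coherent sampling (an integer number of signal periods per record), adds Gaussian timing jitter to the ideal sample instants, estimates amplitude and phase from the signal's FFT bin, and measures their spread over many trials. All parameter values (N, f0, fs, sigma_t, trial count) are arbitrary choices for the example.

```python
import numpy as np

def jitter_trial(N, f0, amp, sigma_t, fs, rng):
    """One record: sample a sine at jittered instants, estimate amp/phase via FFT."""
    n = np.arange(N)
    t = n / fs + rng.normal(0.0, sigma_t, N)  # ideal times plus Gaussian jitter
    x = amp * np.sin(2 * np.pi * f0 * t)
    X = np.fft.fft(x)
    k = int(round(f0 * N / fs))  # bin of the coherently sampled tone
    est_amp = 2.0 * np.abs(X[k]) / N  # single-sided amplitude at the signal bin
    est_phase = np.angle(X[k])
    return est_amp, est_phase

def jitter_stats(N=256, f0=1e3, amp=1.0, sigma_t=1e-6, fs=16e3,
                 trials=2000, seed=0):
    """Standard deviations of the FFT amplitude and phase estimates over many trials."""
    rng = np.random.default_rng(seed)
    res = np.array([jitter_trial(N, f0, amp, sigma_t, fs, rng)
                    for _ in range(trials)])
    return res[:, 0].std(), res[:, 1].std()
```

Here the phase-noise standard deviation is σ_R = 2π·f0·σ_t, so doubling either the signal frequency or the timing-jitter standard deviation should roughly double both output standard deviations, consistent with the proportionality stated in the abstract.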
