Abstract

Analysis of waveform distortion in linear systems has been taught in several ways. The approach introduced here is based on the mean-square difference between the input and output signals. Both deterministic and random signals can be accommodated, the only requirements for a mathematical analysis being knowledge of the input-signal autocorrelation function and the unit-impulse response of the linear system. The corresponding experimental procedure is based on comparison of the output signal with a replica of the input signal that has been delayed and attenuated so as to minimize the mean-square difference. Examples are included that analyze the effect of transmitting either a sinusoidal signal or a bandlimited random signal through an RC low-pass filter. Agreement between theory and experiment is good.
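
The procedure the abstract describes, adjusting the delay and attenuation of an input replica until its mean-square difference from the output is smallest, can be sketched numerically. The following Python sketch is not taken from the paper; the parameter values (fs, f0, tau), the backward-Euler discretization of the RC filter, and the integer-sample grid search over delays are all illustrative assumptions. For a sinusoidal input of angular frequency w, the fitted attenuation and delay should approach the RC filter's theoretical gain 1/sqrt(1 + (w*tau)^2) and phase delay arctan(w*tau)/w.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper)
fs = 10_000.0          # sampling rate, Hz
f0 = 100.0             # input sinusoid frequency, Hz
tau = 1.0e-3           # RC time constant, s
t = np.arange(0.0, 0.1, 1.0 / fs)

x = np.sin(2.0 * np.pi * f0 * t)   # input signal

# RC low-pass filter as a first-order recursion (backward-Euler
# discretization of y' = (x - y)/tau; an assumed implementation).
alpha = (1.0 / fs) / (tau + 1.0 / fs)
y = np.zeros_like(x)
for n in range(1, len(x)):
    y[n] = y[n - 1] + alpha * (x[n] - y[n - 1])

def residual(d):
    """Mean-square difference between the output and the best
    attenuated copy of the input delayed by d samples."""
    xd = np.roll(x, d)                   # delayed replica (input is periodic)
    a = np.dot(y, xd) / np.dot(xd, xd)   # least-squares attenuation for this delay
    return np.mean((y - a * xd) ** 2), a

# Grid search over integer delays for the minimum mean-square difference.
errors = [residual(d)[0] for d in range(200)]
d_opt = int(np.argmin(errors))
mse_opt, a_opt = residual(d_opt)

# Theory for a sinusoid through an RC low-pass:
# gain 1/sqrt(1 + (w*tau)^2), phase delay arctan(w*tau)/w.
w = 2.0 * np.pi * f0
print(f"attenuation: fitted {a_opt:.4f}, theory {1/np.sqrt(1+(w*tau)**2):.4f}")
print(f"delay: fitted {d_opt/fs*1e3:.3f} ms, theory {np.arctan(w*tau)/w*1e3:.3f} ms")
print(f"residual mean-square distortion: {mse_opt:.3e}")
```

For each candidate delay the optimal attenuation has the closed form a = <y, x_d>/<x_d, x_d> (a least-squares projection), so only the delay needs to be searched; the paper's experimental setup adjusts both quantities continuously rather than on a sample grid.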
