Abstract

This paper discusses the bias errors introduced when frequency response and coherence functions are estimated for systems containing a time delay. It is shown that a time delay between the input and output of a linear system causes the cross-spectral density to fluctuate as a function of frequency, with a period of fluctuation that depends on the magnitude of the delay. If the analysis bandwidth is not sufficiently small, or if frequency averaging is used to reduce the variance, the cross-spectral density estimate acquires a negative bias, and this error propagates into estimates of the frequency response and coherence functions. Theory is developed showing how the bias errors depend on the time delay, the analysis bandwidth and the length of the sample record of the input/output processes. Two experiments were designed to check the theory. In the first, a loudspeaker driven by white noise and a microphone were used, with the time delay determined by the known propagation time of acoustic waves between the loudspeaker and the microphone. In the second, a tape recorder with a fixed spacing between its record and playback heads was used to introduce the delay. In both experiments agreement with theory was good.
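As a rough numerical illustration of the effect described above (not part of the original paper), the sketch below simulates a noise-free pure delay, for which the true coherence is unity, and estimates coherence with Welch averaging at several segment lengths T. The comparison value (1 - tau/T)^2 is the commonly quoted first-order bias factor for a pure delay with rectangular windows and non-overlapping segments; the record length, delay and all parameter choices are illustrative assumptions.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)

n = 2**18        # record length in samples (illustrative choice)
delay = 64       # pure time delay tau, in samples

# Noise-free delayed system: y(t) = x(t - tau), so the true coherence is 1
# at every frequency; any shortfall in the estimate is bias.
x = rng.standard_normal(n)
y = np.roll(x, delay)
x, y = x[delay:], y[delay:]   # drop wrapped samples so the delay is causal

for nperseg in (128, 512, 4096):
    # Rectangular window, no overlap, to match the classical derivation.
    f, cxy = signal.coherence(x, y, window='boxcar',
                              nperseg=nperseg, noverlap=0)
    predicted = (1 - delay / nperseg) ** 2   # first-order bias factor
    print(f"T = {nperseg:4d} samples: mean estimated coherence = "
          f"{cxy.mean():.3f}, predicted (1 - tau/T)^2 = {predicted:.3f}")
```

As the segment length T grows relative to the delay, i.e., as the analysis bandwidth 1/T shrinks, the negative bias vanishes, which is the dependence on delay, bandwidth and record length that the abstract describes.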
