Abstract
The performance of generalized correlation (GC) methods for time-delay estimation is improved for the case in which a priori knowledge of the received source spectrum is limited. The treatment is restricted to finite-time observation of a single low-pass source, where the uncertainty in its autospectrum is characterized by a single unknown parameter. The constrained processor (CP) selects an optimum postcorrelator filter from a finite set of alternatives that are designed to span the expected range of source spectra. An approach proposed previously for this situation is the LMS adaptive filtering method for time-delay estimation (LMSTDE). For purposes of comparison, both the theoretical and simulation performance of the LMSTDE method are examined. Simulation experiments indicate that LMSTDE offers performance comparable to conventional GC methods without a priori knowledge of the source spectrum. The CP in the same setting, however, can yield significantly lower variance time-delay estimates than either the LMSTDE or conventional GC methods, with performance approaching the Cramér-Rao lower bound.
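The generalized correlation framework referred to above estimates delay by cross-correlating the two received signals after shaping their cross-spectrum with a postcorrelator weighting. The sketch below is a generic GCC delay estimator, not the paper's constrained processor or LMSTDE method; the `weighting` options ("flat" for plain cross-correlation, "phat" for the phase transform) are illustrative stand-ins for the spectrum-dependent filters the paper selects among.

```python
import numpy as np

def gcc_delay(x, y, fs=1.0, weighting="flat"):
    """Estimate the delay of y relative to x (in seconds) via
    generalized cross-correlation. A generic GCC sketch: the
    cross-spectrum is shaped by a postcorrelator weighting before
    the inverse FFT. 'flat' gives plain cross-correlation; 'phat'
    whitens the cross-spectrum (phase transform)."""
    n = len(x) + len(y) - 1
    nfft = 1 << (n - 1).bit_length()       # zero-pad to a power of two
    X = np.fft.rfft(x, nfft)
    Y = np.fft.rfft(y, nfft)
    Gyx = Y * np.conj(X)                   # cross-spectrum; peak at positive
                                           # lag when y lags x
    if weighting == "phat":
        Gyx /= np.abs(Gyx) + 1e-12         # guard against division by zero
    cc = np.fft.irfft(Gyx, nfft)
    # Rearrange the circular correlation so lag 0 sits at the center.
    max_lag = len(x) - 1
    cc = np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))
    lag = int(np.argmax(cc)) - max_lag
    return lag / fs
```

In the paper's setting the weighting would instead be chosen from a finite set of filters matched to candidate source spectra; the CP picks the member best matched to the observed data.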
Published in: IEEE Transactions on Acoustics, Speech, and Signal Processing