Abstract

We present a least-squares algorithm for time-delay (range) estimation of dual-tone spectrally sparse signals that minimizes bias errors. Dual-tone waveforms achieve near-optimal delay estimation performance by maximizing the mean-square bandwidth of the signal spectrum, which reduces the error bound. However, the choice of estimator may introduce bias, particularly for dual-tone waveforms whose bandwidth (tone separation) is either small or close to the Nyquist rate, and when the delay does not fall on a sample point and thus yields discretization errors. We address this problem by combining a matched filter with least-squares optimization (MF-LS). We compare this approach with a simple matched filter followed by interpolation, and with a matched filter followed by a sinc-function nonlinear least-squares (NL-LS) fit. We demonstrate that the MF-LS algorithm has lower bias errors than interpolation and NL-LS across both bandwidth and delay. We present experimental 2.8-GHz measurements of two-tone delay estimation implemented on a software-defined radio and demonstrate that the MF-LS algorithm reduces root-mean-square error (RMSE) by nearly an order of magnitude compared with interpolation or NL-LS.
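The abstract does not spell out the MF-LS estimator itself, so the sketch below implements only the matched-filter front end together with the interpolation baseline that the paper compares against. The sample rate, tone frequencies, signal length, and true delay are illustrative assumptions, not values from the paper; even in this noise-free setting the interpolated estimate exhibits a sub-sample bias of the kind the MF-LS formulation is designed to suppress.

```python
import numpy as np

fs = 100e6                         # sample rate (assumed), Hz
f1, f2 = 10e6, 30e6                # dual-tone frequencies (assumed), Hz
n = np.arange(4096)
true_delay = 2.6 / fs              # non-integer-sample delay, kept within one
                                   # waveform period (10 samples) to avoid
                                   # correlation ambiguity

def dual_tone(t):
    """Two-tone reference waveform evaluated at times t (seconds)."""
    return np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

tx = dual_tone(n / fs)                  # transmitted template
rx = dual_tone(n / fs - true_delay)     # ideal, noise-free delayed copy

# Matched filter: cross-correlate the received signal with the template.
corr = np.correlate(rx, tx, mode="full")
k = int(np.argmax(corr))

# Baseline estimator: parabolic interpolation of the correlation peak
# refines the integer-sample lag to sub-sample precision.
y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
lag = (k - (len(tx) - 1)) + frac        # lag of rx relative to tx, samples

print(f"true delay:      {true_delay * 1e9:.3f} ns")
print(f"estimated delay: {lag / fs * 1e9:.3f} ns")
```

Because the correlation of the two tones oscillates rapidly near its peak, the parabolic fit over three samples is systematically biased by a fraction of a sample; the paper's MF-LS approach replaces this step with a least-squares fit to reduce that bias.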
