In this article, a novel method for high-precision time-delay estimation (TDE) of narrow-band signals is proposed. It combines the cross-correlation function, its phase spectrum, and a long short-term memory (LSTM) artificial neural network that unwraps the phase transform (PHAT) spectrum of the cross-correlation function. The PHAT-LSTM architecture consists of three parts. The first part is a wrapping parameter estimator (WPE) used to estimate the wrapping parameter of the base-band phase spectrum. The second part, a wrapping classifier (WCF), is a single-output network used to compensate for the shortcomings of the WPE. The third part, a synthesis and fine estimator, combines the outputs of the WPE and WCF to unwrap the phase and estimate the delay according to the phase-delay model. The inputs of the PHAT-LSTM are the fast Fourier transforms of snapshot data from the two receiving channels. In addition, the input dimension is dramatically smaller than in other deep-learning-based TDE methods. Simulation results show that, at low signal-to-noise ratio (SNR), the root mean square error (RMSE) of the PHAT-LSTM is lower than that of traditional TDE methods. At SNR values of 10 dB and 0 dB, the RMSE of the proposed TDE was about one tenth that of the traditional methods.
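The abstract only outlines the PHAT-LSTM pipeline, so the following Python sketch illustrates the general idea under loose assumptions: the function names, the LSTM hidden size, and the least-squares fit of the phase-delay model are illustrative choices, not the authors' implementation.

```python
import numpy as np
import torch
import torch.nn as nn

def phat_phase(x1, x2, fs):
    """Cross-power spectrum of two channel snapshots with PHAT weighting.
    Returns the frequency bins and the wrapped cross-spectrum phase."""
    X1 = np.fft.rfft(x1)
    X2 = np.fft.rfft(x2)
    cross = X1 * np.conj(X2)
    cross /= np.abs(cross) + 1e-12           # PHAT weighting: discard magnitude, keep phase
    freqs = np.fft.rfftfreq(len(x1), d=1.0 / fs)
    return freqs, np.angle(cross)             # wrapped phase in (-pi, pi]

class WrappingParameterEstimator(nn.Module):
    """Hypothetical LSTM regressor over the wrapped phase sequence,
    standing in for the WPE described in the abstract."""
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, phase_seq):              # phase_seq: (batch, n_bins, 1)
        out, _ = self.lstm(phase_seq)
        return self.head(out[:, -1, :])        # estimate of the wrapping parameter

def delay_from_unwrapped_phase(freqs, unwrapped_phase):
    """Least-squares fit of the phase-delay model phi(f) = -2*pi*f*tau."""
    slope = np.polyfit(freqs[1:], unwrapped_phase[1:], 1)[0]
    return -slope / (2.0 * np.pi)
```

In this sketch the wrapped PHAT phase would be fed to the LSTM as a sequence over frequency bins; once the phase is unwrapped with the estimated wrapping parameter, the delay follows from the slope of the linear phase-delay model.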