Abstract

An asymptotically robust invariant (ARI) algorithm for signal detection and time delay estimation is proposed. It is based on the q-point model of noise distributions. The algorithm computes the correlation statistic between a nonlinear transformation of the observed sample and the vector of reference signal samples. The time delay estimate is then obtained by special processing of the resulting statistic that accounts for the presence of mirror interference in the observed process. The algorithm is implemented in the frequency domain, which allows the fast Fourier transform to be used to reduce computational cost. Simulation results show that for heavy-tailed noise distributions the ARI algorithm provides an energy gain over the classical correlation algorithm, while for Gaussian noise its performance is similar.
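The core pipeline described in the abstract — apply a memoryless nonlinearity to the observation, correlate it with the reference via the FFT, and take the peak as the delay estimate — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the specific q-point nonlinear transform and the mirror-interference processing are not given in the abstract, so a hard limiter (`np.sign`) stands in for the nonlinearity and a plain argmax stands in for the special peak processing.

```python
import numpy as np

def delay_estimate(x, s, nonlinearity=np.sign):
    """FFT-based correlation delay estimator (illustrative sketch).

    x : observed sample vector (signal plus noise)
    s : reference signal sample vector, len(s) <= len(x)
    nonlinearity : memoryless robustifying transform applied to the
        observation; np.sign is a placeholder for the q-point transform.
    """
    n = len(x)
    z = nonlinearity(x)                  # robustifying nonlinear transform
    Z = np.fft.rfft(z, n)                # spectrum of transformed observation
    S = np.fft.rfft(s, n)                # zero-padded reference spectrum
    r = np.fft.irfft(Z * np.conj(S), n)  # circular cross-correlation via FFT
    return int(np.argmax(r))             # lag with the largest statistic

# Illustrative use: a +/-1 reference buried in a zero record at lag 37.
rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=64)
x = np.zeros(256)
x[37:37 + 64] = s
print(delay_estimate(x, s))  # recovers the lag of 37
```

The frequency-domain product `Z * conj(S)` replaces the direct lag-by-lag correlation sum, which is what reduces the computational cost from O(n^2) to O(n log n), as the abstract notes.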
