Abstract

An algorithm for continuous time-delay estimation from sampled output data and a known input of finite energy is presented. The continuous time-delay modeling allows for the estimation of subsample delays. The proposed estimation algorithm consists of two steps. First, the continuous Laguerre spectrum of the output (delayed) signal is estimated from discrete-time (sampled) noisy measurements. Second, an estimate of the delay value is obtained via a Laguerre-domain model using a continuous-time description of the input. The second step of the algorithm is shown to be intrinsically biased; the bias sources are established, and the bias itself is modeled. The proposed delay estimation approach is compared in a Monte Carlo simulation with state-of-the-art methods implemented in the time, frequency, and Laguerre domains, demonstrating comparable or higher accuracy in the considered scenario.
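To make the two-step structure concrete, the sketch below is a minimal illustration, not the paper's algorithm: it approximates the continuous Laguerre spectrum of the sampled output by quadrature and then estimates the delay by matching that spectrum against the spectrum of the shifted known input. The Laguerre pole `p`, the quadrature-based spectrum estimator, the least-squares matching criterion, and all function names are illustrative assumptions; the paper's spectrum estimator, Laguerre-domain delay model, and bias treatment are not reproduced here.

```python
import numpy as np
from scipy.special import eval_laguerre
from scipy.integrate import trapezoid
from scipy.optimize import minimize_scalar


def laguerre_function(k, t, p):
    """Continuous-time Laguerre function of order k with pole p > 0."""
    return np.sqrt(2.0 * p) * np.exp(-p * t) * eval_laguerre(k, 2.0 * p * t)


def laguerre_spectrum(signal, t, p, n_coeffs):
    """Approximate continuous Laguerre coefficients of a sampled signal by
    trapezoidal quadrature (a stand-in for the paper's spectrum estimator)."""
    return np.array([trapezoid(signal * laguerre_function(k, t, p), t)
                     for k in range(n_coeffs)])


def estimate_delay(u, y_samples, t, p, n_coeffs, tau_max):
    """Pick the delay tau that minimizes the mismatch between the estimated
    output spectrum and the spectrum of the shifted known input u(t - tau)."""
    w_y = laguerre_spectrum(y_samples, t, p, n_coeffs)

    def cost(tau):
        w_u = laguerre_spectrum(u(t - tau), t, p, n_coeffs)
        return np.sum((w_y - w_u) ** 2)

    res = minimize_scalar(cost, bounds=(0.0, tau_max), method="bounded")
    return res.x


if __name__ == "__main__":
    # Toy usage: a noisy, delayed exponential pulse with a delay that is
    # not a multiple of the sampling period (subsample delay).
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 20.0, 2001)                 # 0.01 s sampling grid
    u = lambda s: np.exp(-s) * (s >= 0)              # known finite-energy input
    true_tau = 1.234
    y = u(t - true_tau) + 0.01 * rng.standard_normal(t.size)
    print(estimate_delay(u, y, t, p=1.0, n_coeffs=20, tau_max=5.0))
```

Because the delay enters through a continuous-time description of the input, the estimate in this sketch is not restricted to the sampling grid, which mirrors the subsample-delay capability described above.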
