Abstract

Pisarenko's harmonic retrieval method involves determining the minimum eigenvalue, and the corresponding eigenvector, of the covariance matrix of the observed random process. Recently, Thompson [9] suggested a constrained gradient search procedure for obtaining an adaptive version of Pisarenko's method, and his simulations verified that the frequency estimates provided by the procedure are unbiased. The main drawback of this technique, however, is that the initial convergence rate can be very slow for poor initial conditions. By restating the constrained minimization as an unconstrained nonlinear problem, we derive an alternative Gauss-Newton type recursive algorithm that also uses the second-derivative matrix (Hessian); this algorithm may also be viewed as an approximate least squares algorithm. Simulations were performed to compare this algorithm with (a slight variant of) Thompson's original algorithm. The most important conclusions are that the least squares type algorithm converges faster in the early stages, while its convergence rate close to the true parameters depends on the signal-to-noise ratio of the input signal, and that the approximate least squares algorithm resolves the sinusoids much faster than the gradient version.

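As background, the following is a minimal sketch of the batch (non-adaptive) core of Pisarenko's method that the abstract refers to: estimate the covariance (autocorrelation) matrix, take the eigenvector of its minimum eigenvalue, and read the sinusoid frequency from the roots of the polynomial it defines. It is not the adaptive gradient or least squares algorithm compared in the paper; all variable names, the single-sinusoid setup, and the parameter values are illustrative assumptions.

import numpy as np

# Assumed test signal: one real sinusoid in white noise (normalized frequency f_true).
rng = np.random.default_rng(0)
f_true = 0.2
n = np.arange(2000)
x = np.cos(2 * np.pi * f_true * n) + 0.1 * rng.standard_normal(n.size)

# One real sinusoid requires a 3x3 autocorrelation matrix (2p+1 with p = 1).
m = 3
r = np.array([np.mean(x[:x.size - k] * x[k:]) for k in range(m)])  # lags 0..m-1
R = np.array([[r[abs(i - j)] for j in range(m)] for i in range(m)])  # Toeplitz estimate

# Eigenvector of the minimum eigenvalue gives the annihilating polynomial coefficients.
eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
a = eigvecs[:, 0]                      # minimum-eigenvalue eigenvector

# Its roots lie (approximately) on the unit circle at exp(+/- j*2*pi*f).
roots = np.roots(a)                    # roots of a[0] z^2 + a[1] z + a[2]
f_est = np.abs(np.angle(roots[0])) / (2 * np.pi)
print(f"true f = {f_true}, estimated f = {f_est:.4f}")

The adaptive methods discussed in the abstract avoid forming and eigendecomposing R explicitly, instead updating an estimate of the minimum eigenvector recursively as new samples arrive.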