Abstract

The paradigmatic example of a signal with finite rate of innovation (FRI) is a linear combination of finitely many Diracs per unit time, also known as a spike sequence. Many researchers have studied the problem of estimating the innovative part of a spike sequence, i.e., the time instants t_k and weights c_k of the Diracs, and have proposed various deterministic and stochastic algorithms, particularly for samples corrupted by digital noise. In the noisy setting, maximum likelihood (ML) estimation, which is inherently an optimization problem, has proved to be a powerful tool for reconstructing FRI signals. Wein and Srinivasan presented an algorithm, IterML, for reconstructing streams of Diracs in noisy conditions, achieving promising reconstruction error and runtime. However, IterML must limit the resolution of its search grid for the t_k to avoid the curse of dimensionality, which makes it unsuitable for applications that require highly accurate reconstruction of time instants. To overcome this shortcoming, we introduce a novel modified local-best particle swarm optimization (MLBPSO) algorithm that maximizes the likelihood of the innovative parameters of a sparse spike sequence given noisy low-pass-filtered samples. We demonstrate via extensive simulations that MLBPSO outperforms IterML in terms of robustness to noise and accuracy of the estimated parameters while maintaining comparable computational cost.
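To make the idea concrete, the following is a minimal, illustrative sketch of local-best (ring-topology) particle swarm optimization applied to ML estimation of Dirac instants and weights from noisy low-pass samples. This is not the paper's MLBPSO: the Gaussian low-pass kernel, the problem sizes, and the standard PSO coefficients (inertia 0.72, acceleration 1.49) are assumptions chosen only for the demonstration. Under i.i.d. Gaussian noise, maximizing the likelihood reduces to minimizing the squared error between observed and modeled samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy FRI setup: K Diracs filtered by a Gaussian low-pass kernel, sampled at N points.
# All names and parameter values here are illustrative, not taken from the paper.
K, N, sigma = 2, 32, 1.5
t_grid = np.arange(N)
true_t = np.array([8.3, 21.7])   # ground-truth instants t_k (off-grid)
true_c = np.array([1.0, -0.7])   # ground-truth weights c_k

def model(params):
    """Low-pass filtered samples of a K-Dirac stream; params = [t_1..t_K, c_1..c_K]."""
    t, c = params[:K], params[K:]
    return (c[:, None] * np.exp(-(t_grid - t[:, None]) ** 2 / (2 * sigma**2))).sum(0)

y = model(np.concatenate([true_t, true_c])) + 0.05 * rng.standard_normal(N)

def log_likelihood(params):
    # Gaussian noise: ML estimation <=> least-squares fit (up to constants).
    return -np.sum((y - model(params)) ** 2)

# Local-best PSO: each particle is guided by the best particle in a small
# ring neighborhood (itself and its two index neighbors), not the global best.
S, D, iters = 40, 2 * K, 200
w, c1, c2 = 0.72, 1.49, 1.49
lo = np.concatenate([np.zeros(K), -2 * np.ones(K)])
hi = np.concatenate([np.full(K, float(N)), 2 * np.ones(K)])
x = rng.uniform(lo, hi, (S, D))
v = np.zeros((S, D))
pbest = x.copy()
pval = np.array([log_likelihood(p) for p in x])

for _ in range(iters):
    idx = np.arange(S)
    nb = np.stack([np.roll(idx, 1), idx, np.roll(idx, -1)])   # ring neighborhoods
    lbest = pbest[nb[np.argmax(pval[nb], axis=0), idx]]
    r1, r2 = rng.random((S, D)), rng.random((S, D))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (lbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([log_likelihood(p) for p in x])
    better = f > pval
    pbest[better], pval[better] = x[better], f[better]

best = pbest[np.argmax(pval)]
est_t = np.sort(best[:K])   # estimated instants are continuous-valued, not grid-bound
```

Because the particles move in a continuous parameter space, the estimated t_k are not confined to a search grid, which is the qualitative advantage the abstract claims over IterML; the ring topology slows information flow between particles and helps avoid premature convergence to a local optimum of the multimodal likelihood.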
