Abstract

A minimum-complexity method is presented for interpolating a bounded bandlimited signal from its own samples while guaranteeing an a priori specified accuracy. The method results from combining a fast-converging sampling expansion with the Farrow interpolator technique. It provides significant complexity reductions in various signal processing applications, of which three cases are studied. For the Farrow interpolator, it ensures that the interpolation error remains below a known bound in both the time and frequency domains. For delay estimation, the method allows a given estimation/detection algorithm to be decoupled into two steps. In the first, a few finite convolutions are carried out to compute a set of interpolation coefficients. In the second, the algorithm is executed with a complexity that is independent of the length of the temporal observation interval. It is also shown how to introduce this decoupling into the matched-filter, MUSIC, and conditional maximum-likelihood delay estimators. Finally, the method is employed to give a simple solution to an important efficiency problem in the simulation of communications systems: the generation of Rayleigh processes.
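To illustrate the Farrow structure referenced in the abstract, the following is a minimal Python sketch of a standard cubic-Lagrange Farrow interpolator. The function name farrow_cubic and the choice of a cubic-Lagrange polynomial are illustrative assumptions; the paper's method relies on its own fast-converging sampling expansion and coefficient design, which is not reproduced here.

import numpy as np

def farrow_cubic(x, n, mu):
    """Cubic-Lagrange Farrow interpolation of x at position n + mu (0 <= mu < 1).

    The four samples x[n-1..n+2] are combined by fixed branch filters into the
    polynomial coefficients c0..c3; the fractional delay mu enters only the final
    Horner evaluation, which is the defining trait of the Farrow structure.
    """
    s0, s1, s2, s3 = x[n - 1], x[n], x[n + 1], x[n + 2]
    c0 = s1
    c1 = -s0 / 3 - s1 / 2 + s2 - s3 / 6
    c2 = s0 / 2 - s1 + s2 / 2
    c3 = (-s0 + 3 * s1 - 3 * s2 + s3) / 6
    return ((c3 * mu + c2) * mu + c1) * mu + c0

# Example: evaluate an oversampled sinusoid at a fractional sample offset.
if __name__ == "__main__":
    t = np.arange(64)
    x = np.sin(2 * np.pi * 0.05 * t)
    approx = farrow_cubic(x, 10, 0.3)
    exact = np.sin(2 * np.pi * 0.05 * (10 + 0.3))
    print(f"interpolated={approx:.6f}, exact={exact:.6f}")

In the decoupling described in the abstract, the polynomial coefficients would be precomputed once from a few finite convolutions, so that the per-output cost of the second step does not grow with the observation interval.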
