Abstract

This paper develops a theory for an extrapolation algorithm for band-limited signals in the presence of sampling errors or low-level noise. The main result is as follows: suppose f(t) is a continuous Ω-band-limited signal in L². Then for any positive numbers T, A (> T), and ε, there exists a δ > 0 such that whenever the sampling error (or noise) of f(t) on [−T, T] is less than δ, the values of f(t) can be extrapolated to the interval [−A, A] so that the difference between the extrapolated value and the exact value of f(t) does not exceed ε.
