Abstract

This paper focuses on the interpolation of a signal, modeled as a random process, from a set of discrete-time measurements. Signal sampling is studied as the conditioning of a random process at the instants of its discrete-time observations. The analysis shows that even if the input is modeled by a stationary Gaussian process, the random process conditioned on a set of observations at the sampling instants is still Gaussian but non-stationary. Using standard properties of the multivariate normal distribution, we derive the mean of the conditional process, which is also the minimum mean-square error (MMSE) predictor for signal reconstruction given the information carried by the observations. It is shown that for Gaussian signals the MMSE predictor is a linear function of the observed data. For bandlimited signals, the conditional MMSE predictor coincides with the well-known MMSE reconstruction derived by Yen from a deterministic approach. Although the approach covers any measurement scheme, possibly non-uniform in time, the study is narrowed down to interpolation of the signal from its level-crossing samples. The accuracy of the MMSE reconstruction is verified by simulations.
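The conditional-mean reconstruction described above can be sketched numerically. For jointly Gaussian variables, the conditional mean given the observations is a linear function of the observed data, E[x | y] = K_xy K_yy^{-1} y, where K_yy is the covariance of the observations and K_xy the cross-covariance between query and observation instants. The squared-exponential kernel, the sample instants, and the noiseless-observation setting below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def stationary_kernel(t1, t2, scale=1.0):
    """Illustrative stationary covariance: depends only on the lag t1 - t2."""
    d = np.subtract.outer(t1, t2)
    return np.exp(-(d / scale) ** 2)

def mmse_interpolate(t_obs, y_obs, t_query, kernel=stationary_kernel):
    """Conditional mean of a zero-mean Gaussian process given its samples.

    Implements E[x | y] = K_xy K_yy^{-1} y, which is linear in y.
    """
    K_yy = kernel(t_obs, t_obs)       # covariance among observations
    K_xy = kernel(t_query, t_obs)     # cross-covariance query vs. observations
    # Solve the linear system instead of inverting K_yy for stability.
    return K_xy @ np.linalg.solve(K_yy, y_obs)

# Non-uniform observation instants (e.g. level-crossing samples).
t_obs = np.array([0.0, 0.4, 1.1, 2.0, 3.3])
y_obs = np.sin(t_obs)
t_query = np.linspace(0.0, 3.3, 50)
y_hat = mmse_interpolate(t_obs, y_obs, t_query)
```

In the noiseless case sketched here, the reconstruction interpolates the data exactly: evaluating the predictor at the observation instants returns the observed values, since K_xy then equals K_yy.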
