Abstract

Minimum mean squared error (MMSE) estimators of signals from samples corrupted by jitter (timing noise) and additive noise are nonlinear, even when the signal prior and additive noise have normal distributions. This paper develops a stochastic algorithm based on Gibbs sampling and slice sampling to approximate the optimal MMSE estimator in this Bayesian formulation. Simulations demonstrate that this nonlinear algorithm can improve significantly upon the linear MMSE estimator, as well as the EM algorithm approximation to the maximum likelihood (ML) estimator used in classical estimation. Effective off-chip post-processing to mitigate jitter enables greater jitter to be tolerated, potentially reducing on-chip ADC power consumption.
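To make the approach concrete, below is a minimal sketch of how a Gibbs sampler with slice-sampling updates can approximate the MMSE estimate in a model of this kind. It assumes a sinc-basis signal with i.i.d. Gaussian coefficients, Gaussian jitter, and Gaussian additive noise: conditioned on the jitter, the coefficient posterior is Gaussian and is sampled in closed form, while each jitter variable is updated with a one-dimensional slice sampler. All names, parameter values, and the specific signal model are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only, not the authors' implementation. Assumes a
# bandlimited signal x(t) = sum_k c_k sinc(t - k) with c ~ N(0, I),
# samples y_n = x(t_n + z_n) + w_n, jitter z_n ~ N(0, sigma_z^2), and
# additive noise w_n ~ N(0, sigma_w^2). Parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

K, N = 8, 32                  # signal coefficients, number of samples
sigma_z, sigma_w = 0.2, 0.1   # jitter and additive-noise std. deviations
t = np.linspace(0, K - 1, N)  # nominal sampling times

def basis(tau):
    """Matrix of sinc basis functions evaluated at time(s) tau."""
    return np.sinc(np.subtract.outer(np.atleast_1d(tau), np.arange(K)))

c_true = rng.standard_normal(K)             # draw from the Gaussian prior
z_true = sigma_z * rng.standard_normal(N)   # unknown timing jitter
y = basis(t + z_true) @ c_true + sigma_w * rng.standard_normal(N)

def slice_sample(logp, x0, w=0.1, max_steps=50):
    """One 1-D slice-sampling update (stepping out + shrinkage, Neal 2003)."""
    logy = logp(x0) + np.log(rng.random())  # vertical level under the density
    lo = x0 - w * rng.random()
    hi = lo + w
    for _ in range(max_steps):              # step the interval out ...
        if logp(lo) <= logy:
            break
        lo -= w
    for _ in range(max_steps):
        if logp(hi) <= logy:
            break
        hi += w
    while True:                             # ... then shrink until accepted
        x1 = rng.uniform(lo, hi)
        if logp(x1) >= logy:
            return x1
        if x1 < x0:
            lo = x1
        else:
            hi = x1

def gibbs_mmse(y, n_iter=500, burn=100):
    """Approximate E[c | y] by alternating conjugate and slice updates."""
    z = np.zeros(N)
    c_samples = []
    for it in range(n_iter):
        H = basis(t + z)
        # c | z, y is Gaussian (linear model, Gaussian prior): sample exactly
        S = np.linalg.inv(np.eye(K) + H.T @ H / sigma_w**2)
        S = (S + S.T) / 2                   # enforce numerical symmetry
        m = S @ H.T @ y / sigma_w**2
        c = rng.multivariate_normal(m, S)
        # z_n | c, y_n is non-Gaussian: update each with slice sampling
        for n in range(N):
            def logp(zn, n=n):
                r = y[n] - basis(t[n] + zn)[0] @ c
                return -0.5 * (r / sigma_w)**2 - 0.5 * (zn / sigma_z)**2
            z[n] = slice_sample(logp, z[n])
        if it >= burn:
            c_samples.append(c)
    return np.mean(c_samples, axis=0)       # posterior mean approximates MMSE

c_mmse = gibbs_mmse(y)
print("coefficient MSE of sampled MMSE estimate:",
      np.mean((c_mmse - c_true) ** 2))
```

Averaging the post-burn-in coefficient samples approximates the posterior mean, which is the MMSE estimator; slice sampling is a natural choice for the jitter conditionals because it requires no hand-tuned proposal distribution.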
