Abstract

The method of interval errors (MIE) predicts mean-squared error (MSE) performance at low signal-to-noise ratios (SNR), where global errors dominate. It is algorithm-specific and is enabled by an estimate of the asymptotic MSE together with the sidelobe (interval) error probabilities. Parameter bounds adequately represent the asymptotic MSE in the absence of signal model mismatch, whereas Taylor's theorem can account for such mismatch. Herein, the limitations of bounds versus Taylor's theorem for representing the asymptotic MSE of nonlinear schemes such as maximum likelihood are discussed. The use of first-order Taylor expansions to improve the approximation of the sidelobe error probabilities is likewise explored.
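For illustration, the sketch below shows the standard MIE-style MSE decomposition described above: the total MSE is approximated as a mixture of the local (asymptotic) MSE and the squared errors incurred when the estimator locks onto a sidelobe, weighted by the sidelobe error probabilities. This is a minimal, hedged example; the function and argument names are hypothetical and the asymptotic term could equally be a parameter bound (no mismatch) or a Taylor-expansion-based value (under mismatch), as contrasted in the paper.

```python
import numpy as np

def mie_mse_approx(asymptotic_mse, sidelobe_probs, sidelobe_offsets):
    """Sketch of a method-of-interval-errors (MIE) MSE approximation.

    asymptotic_mse   : local (mainlobe) MSE estimate, e.g. a bound such as
                       the CRB, or a first-order Taylor-based value under
                       signal model mismatch (assumed supplied externally)
    sidelobe_probs   : probability of the estimator selecting each sidelobe
                       interval (a global/interval error)
    sidelobe_offsets : parameter error incurred when each sidelobe is chosen
    """
    sidelobe_probs = np.asarray(sidelobe_probs, dtype=float)
    sidelobe_offsets = np.asarray(sidelobe_offsets, dtype=float)

    # Total probability of any global (interval) error.
    p_global = sidelobe_probs.sum()

    # With probability (1 - p_global) the estimate stays in the mainlobe
    # interval and the asymptotic MSE applies; otherwise a sidelobe error
    # contributes its squared offset.
    local_term = (1.0 - p_global) * asymptotic_mse
    global_term = np.sum(sidelobe_probs * sidelobe_offsets**2)
    return local_term + global_term


# Hypothetical usage: one dominant sidelobe pair at +/- 0.5 with small
# error probabilities, and a CRB-like asymptotic MSE of 1e-3.
print(mie_mse_approx(1e-3, [0.01, 0.01], [0.5, -0.5]))
```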
