Abstract

The method of interval errors (MIE) for predicting the mean squared error (MSE) performance of maximum-likelihood estimators (MLEs) is extended to signal parameter estimation problems requiring intermediate estimation of an unknown colored noise covariance matrix, an intermediate step central to adaptive array detection and parameter estimation. Successful application of MIE requires good approximations of two quantities: 1) the interval error probabilities and 2) the asymptotic (SNR → ∞) local MSE performance of the MLE. Exact general expressions for the pairwise error probabilities, including the effects of signal model mismatch, are derived herein; in conjunction with the Union Bound, they provide accurate prediction of the required interval error probabilities. The Cramér-Rao Bound (CRB) often provides an adequate prediction of the asymptotic local MSE performance of the MLE. For the deterministic signal model, however, the signal parameters are decoupled from the colored noise parameters in the Fisher Information Matrix, rendering the CRB incapable of reflecting the loss due to colored noise covariance estimation. A new modification of the CRB involving a complex central beta random variable, different from but analogous to the Reed, Mallett, and Brennan beta loss factor, provides a working solution to this problem, facilitating MSE prediction well into the threshold region with remarkable accuracy.
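As context for the beta loss factor mentioned above: the classical Reed, Mallett, and Brennan (RMB) result states that the SINR loss of a sample-matrix-inversion adaptive filter, trained on K snapshots in an N-dimensional space, follows a Beta(K − N + 2, N − 1) distribution with mean (K − N + 2)/(K + 1). The paper's modified CRB uses a different but analogous complex central beta variable; the sketch below illustrates only the classical RMB factor (function name and parameter values are illustrative, not from the paper):

```python
import numpy as np

def rmb_mean_loss(K, N):
    """Expected SINR loss factor E[rho] = (K - N + 2) / (K + 1)
    under the classical RMB result, where rho ~ Beta(K - N + 2, N - 1)."""
    return (K - N + 2) / (K + 1)

# Monte Carlo check: draw the beta loss factor directly and compare
# its sample mean against the closed-form RMB expectation.
rng = np.random.default_rng(0)
N, K = 8, 24                     # array dimension, training snapshots
samples = rng.beta(K - N + 2, N - 1, size=200_000)
print(rmb_mean_loss(K, N))       # closed-form mean: 0.72
print(samples.mean())            # sample mean, close to 0.72
```

The familiar rule of thumb K ≈ 2N follows from this mean: with K = 2N the expected loss is (N + 2)/(2N + 1) ≈ 1/2, i.e., about 3 dB.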
