Abstract

Maximum Likelihood (ML) estimation requires precise knowledge of the underlying statistical model. In Quasi ML (QML), a presumed model is used as a substitute for the (unknown) true model. In the context of Independent Vector Analysis (IVA), we consider the Gaussian QML Estimate (QMLE) of the set of demixing matrices and present an (approximate) analysis of its asymptotic separation performance. In Gaussian QML the sources are presumed to be Gaussian, with covariance matrices specified by some "educated guess". The resulting quasi-likelihood equations of the demixing matrices take a special form, recently termed an extended "Sequentially Drilled" Joint Congruence (SeDJoCo) transformation, which is reminiscent of (though essentially different from) classical joint diagonalization. We show that asymptotically this QMLE, i.e., the solution of the resulting extended SeDJoCo transformation, attains perfect separation (under some mild conditions) regardless of the sources' true distributions and/or covariance matrices. In addition, based on the "small-errors" assumption, we present a first-order perturbation analysis of the extended SeDJoCo solution. Using the resulting closed-form expressions for the errors in the solution matrices, we provide closed-form expressions for the resulting Interference-to-Source Ratios (ISRs) for IVA. Moreover, we prove that asymptotically the ISRs depend only on the sources' covariances, and not on their specific distributions. As an immediate consequence of this result, we provide an asymptotically attainable lower bound on the resulting ISRs. We also present empirical results from three simulation experiments, corroborating our analytical derivations, concerning two possible model errors: inaccurate covariance matrices and mismodeled source distributions.
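For context, a minimal sketch of the (non-extended) SeDJoCo condition may help fix ideas; the notation here is ours and not taken from the paper. Given a set of symmetric positive-definite target matrices Q_1, ..., Q_K, a matrix B is a SeDJoCo solution if

\[
B\, Q_k\, B^{\top} e_k \;=\; e_k, \qquad k = 1, \dots, K,
\]

where e_k denotes the k-th standard unit vector. In words, only the k-th column (and, by symmetry, the k-th row) of each congruence-transformed matrix B Q_k B^{\top} is required to equal e_k, hence "sequentially drilled" rather than fully jointly diagonalized. In the extended (IVA) version referred to in the abstract, a set of demixing matrices is sought and the analogous conditions couple all of them through cross-dataset target matrices; the precise formulation is given in the full text.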
