Independent Vector Analysis (IVA) has emerged in recent years as an extension of Independent Component Analysis (ICA) to multiple sets of mixtures, where the source signals within each set are independent but may depend on the source signals in the other sets. In a semi-blind IVA (or ICA) framework, information regarding the probability distributions of the sources may be available, giving rise to Maximum Likelihood (ML) separation. In recent work we have shown that under the multivariate Gaussian model, with arbitrary (stationary or non-stationary) temporal covariance matrices of the source signals, ML separation requires the solution of a "Sequentially Drilled" Joint Congruence (SeDJoCo) transformation of a set of matrices, which is reminiscent of (but different from) classical joint diagonalization. In this paper we extend our results to the IVA problem, showing how the ML solution for the Gaussian model (with arbitrary covariance and cross-covariance matrices) takes the form of an extended SeDJoCo problem. We formulate the extended problem, derive a condition for the existence of a solution, and propose two iterative solution algorithms. In addition, we derive the induced Cramér-Rao Lower Bound (iCRLB) on the resulting Interference-to-Source Ratio (ISR) matrices, and demonstrate by simulation how the ML separation obtained by solving the extended SeDJoCo problem indeed attains the iCRLB (asymptotically), in contrast to other separation approaches, which cannot exploit prior knowledge of the sources' distributions.
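For orientation, a minimal sketch of the basic (single-set) SeDJoCo condition is given below; the notation ($\mathbf{B}$, $\mathbf{C}_k$, $\mathbf{e}_k$, $K$) is illustrative rather than taken verbatim from the paper, and the extended IVA version additionally couples the datasets through their cross-covariance matrices:
\[
\text{given symmetric target matrices } \mathbf{C}_1,\ldots,\mathbf{C}_K \in \mathbb{R}^{K\times K},
\quad \text{find } \mathbf{B} \ \text{such that} \quad
\mathbf{B}\,\mathbf{C}_k\,\mathbf{B}^{T}\mathbf{e}_k = \mathbf{e}_k, \qquad k = 1,\ldots,K,
\]
where $\mathbf{e}_k$ denotes the $k$-th column of the identity matrix, so that in the $k$-th transformed matrix only the $k$-th row and column are "drilled" to the corresponding unit vector, rather than the full matrix being diagonalized.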