Abstract

In this paper, a new algorithm for approximate joint diagonalization (AJD) of a set of positive-definite Hermitian matrices is presented. The AJD matrix, assumed to be square and non-unitary, is derived by minimizing a quasi-maximum likelihood (QML) objective function. This objective function coincides asymptotically with the maximum likelihood (ML) objective function, enabling the proposed algorithm to asymptotically approach ML estimation performance. In the proposed method, the rows of the AJD matrix are obtained independently, in an iterative manner; this feature enables direct estimation of full-row-rank rectangular AJD sub-matrices. Under some mild assumptions, convergence of the proposed algorithm is asymptotically guaranteed: the error norm corresponding to each row of the AJD matrix decreases significantly after the first iteration, and the convergence is almost Q-superlinear. This property yields rapid convergence and, consequently, a low computational load. The performance of the proposed algorithm is evaluated and compared with other state-of-the-art AJD algorithms, and its practical use is demonstrated on the blind source separation and blind source extraction problems. The results imply that, under the assumptions of high signal-to-noise ratio and a large number of matrices, the proposed algorithm is computationally efficient, with performance similar to that of state-of-the-art AJD algorithms.
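To make the AJD setting concrete, the following is a minimal, real-valued sketch (not the paper's algorithm) of an ML-type joint-diagonalization criterion of the kind the QML objective approximates: for a candidate diagonalizer B and target matrices C_k, the criterion sum_k [log det diag(B C_k B^T) - log det(B C_k B^T)] is nonnegative and vanishes exactly when every B C_k B^T is diagonal. All names and the synthetic data model (C_k = A D_k A^T with positive diagonal D_k) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 4, 20

# Hypothetical synthetic model: C_k = A D_k A^T with positive diagonal D_k,
# so the exact diagonalizer is B = A^{-1} (up to row scaling and permutation).
A = rng.standard_normal((n, n))
Cs = [A @ np.diag(rng.uniform(0.5, 2.0, n)) @ A.T for _ in range(K)]

def ml_objective(B, Cs):
    """ML-type joint-diagonalization criterion (illustrative, not the paper's QML):
    sum_k [log det diag(B C_k B^T) - log det(B C_k B^T)].
    By Hadamard's inequality this is >= 0, with equality iff each
    B C_k B^T is exactly diagonal."""
    total = 0.0
    for C in Cs:
        M = B @ C @ B.T
        total += np.sum(np.log(np.diag(M))) - np.linalg.slogdet(M)[1]
    return total

B_true = np.linalg.inv(A)
print(ml_objective(B_true, Cs))    # near zero: B_true diagonalizes every C_k
print(ml_objective(np.eye(n), Cs)) # strictly positive: identity does not
```

An AJD algorithm such as the one proposed in the paper searches for the B that minimizes an objective of this kind; the row-wise iterative structure described above updates one row of B at a time.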
