Abstract

We consider the blind source separation (BSS) problem and the closely related approximate joint diagonalization (AJD) problem for symmetric positive definite (SPD) matrices. Both problems can be reduced to an optimization problem with three key components: the criterion to minimize, the constraint on the solution, and the optimization algorithm used to solve it. This paper makes two contributions that allow these issues to be treated independently. We build the first complete Riemannian optimization framework suited to BSS and AJD, which handles three classical constraints and allows a wide range of general optimization algorithms on manifolds to be used. We also perform a thorough study of the AJD problem for SPD matrices from an information-geometry point of view: we study AJD criteria based on several divergences on the set of SPD matrices, provide three optimization strategies to minimize them, and analyze their properties. Our numerical experiments on simulated and pseudo-real electroencephalographic data demonstrate the value of the Riemannian optimization framework and of the different AJD criteria we consider.
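
For concreteness, the sketch below illustrates one classical divergence-based AJD cost of the kind discussed above: the Kullback-Leibler (log-likelihood) criterion, which vanishes exactly when all transformed matrices are diagonal. This is only an illustrative example under simple assumptions, not the specific criteria or optimization framework developed in the paper; the function name and toy data are hypothetical.

```python
import numpy as np

def kl_ajd_cost(B, matrices):
    """Kullback-Leibler based AJD criterion:
    sum_k [ log det(diag(B C_k B^T)) - log det(B C_k B^T) ].
    Nonnegative for SPD inputs, and zero iff every B C_k B^T is diagonal."""
    cost = 0.0
    for C in matrices:
        S = B @ C @ B.T
        # log-det of the diagonal part minus log-det of the full matrix
        cost += np.sum(np.log(np.diag(S))) - np.linalg.slogdet(S)[1]
    return cost

# Toy usage: SPD matrices sharing a common (unknown) diagonalizer A^{-1}
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
mats = [A @ np.diag(rng.uniform(0.5, 2.0, 4)) @ A.T for _ in range(5)]
print(kl_ajd_cost(np.linalg.inv(A), mats))  # ~0: exact joint diagonalization
print(kl_ajd_cost(np.eye(4), mats))         # > 0 in general
```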
