Abstract

We consider the estimation of the data model of independent component analysis when Gaussian noise is present. We show that joint maximum likelihood estimation of the independent components and the mixing matrix leads to an objective function already proposed by Olshausen and Field using a different derivation. Due to the complicated nature of the objective function, we introduce approximations that greatly simplify the optimization problem. We show that the presence of noise implies that the relation between the observed data and the estimates of the independent components is non-linear, and show how to approximate this non-linearity. In particular, the non-linearity may be approximated by a simple shrinkage operation in the case of super-Gaussian (sparse) data. Using these approximations, we propose an efficient algorithm for approximate maximization of the likelihood. In the case of super-Gaussian components, the algorithm reduces to a form of simple competitive learning, and in the case of sub-Gaussian components, to anti-competitive learning.
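As an illustration of the shrinkage operation mentioned above, the following sketch applies soft-thresholding to a noisy sparse signal. This is a minimal toy example, not the paper's algorithm: the `soft_shrink` function, the Laplacian source model, and the choice of threshold are all illustrative assumptions, shown only to convey how a shrinkage non-linearity pulls noisy estimates of sparse components toward zero.

```python
import numpy as np

def soft_shrink(u, t):
    """Soft-thresholding shrinkage: sets values with |u| <= t to zero
    and moves larger values toward zero by t. A non-linearity of this
    form arises (up to constants) for a Laplacian, i.e. sparse, prior."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

# Toy demonstration (illustrative only): a sparse (super-Gaussian)
# source corrupted by additive Gaussian noise, then shrunk.
rng = np.random.default_rng(0)
s = rng.laplace(scale=1.0, size=1000)       # sparse source
x = s + rng.normal(scale=0.3, size=1000)    # noisy observation
s_hat = soft_shrink(x, t=0.2)               # threshold chosen by hand here

print(np.mean(np.abs(s_hat) < 1e-12))       # fraction of estimates set exactly to zero
```

Small observations, which are mostly noise, are suppressed entirely, while large observations, which are likely genuine source activity, are only slightly attenuated; this is the intuition behind using shrinkage as the estimator non-linearity for sparse data.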

Full Text
Published version (Free)