Abstract

Independent component analysis (ICA) is a signal processing technique in which a set of random variables is represented in terms of a set of underlying independent component variables. Its most central application is blind source separation of time-domain signals. Most approaches to the ICA problem start from information-theoretic criteria such as maximum likelihood or maximum entropy, and result in numerical on-line learning algorithms. We emphasize here the connection of ICA to neural learning, especially constrained Hebbian learning rules that are nonlinear extensions of the principal component analysis learning rules introduced by the author. We review results showing that the nonlinearities in the learning rules are not critical but can be chosen so that the learning rules not only produce independent components but also have other desirable properties, such as robustness or fast convergence, in contrast to the often-used polynomial functions arising from cumulant expansions. Fast batch versions of the learning rules have also been developed. Some results are given on the stationary points and their asymptotic stability. It is pointed out that sigmoid-shaped nonlinear functions are a good choice from several points of view.
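As a concrete illustration of the kind of constrained Hebbian rule discussed in the abstract, the following is a minimal sketch in Python of a nonlinear PCA-type learning rule with a sigmoid-shaped (tanh) nonlinearity, applied to a whitened two-channel mixture. The specific update form ΔW = μ g(y)(xᵀ − g(y)ᵀW) with y = Wx, the choice of uniform (sub-Gaussian) sources for which tanh is a suitable nonlinearity, and all variable names are illustrative assumptions, not details taken from the paper.

```python
# Sketch: nonlinear Hebbian (PCA-type) learning rule for ICA.
# Assumptions (not from the paper): whitened data, tanh nonlinearity,
# the subspace-rule form W <- W + mu * g(y) (x^T - g(y)^T W), y = W x.
import numpy as np

rng = np.random.default_rng(0)

# Two independent sub-Gaussian (uniform) sources, linearly mixed.
n = 20000
S = rng.uniform(-1, 1, size=(2, n))   # unknown independent sources
A = rng.normal(size=(2, 2))           # unknown mixing matrix
X = A @ S                             # observed mixtures

# Whitening (zero mean, identity covariance): standard ICA preprocessing.
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
X = (E / np.sqrt(d)) @ E.T @ X        # E diag(d^-1/2) E^T X

# On-line constrained Hebbian updates with a sigmoid-shaped nonlinearity.
g = np.tanh
W = 0.1 * rng.normal(size=(2, 2))     # small random initial weights
mu = 0.01                             # fixed learning rate, for simplicity
for t in range(n):
    x = X[:, t]
    y = W @ x
    gy = g(y)
    # Hebbian term g(y) x^T, constrained by the feedback term -g(y) g(y)^T W.
    W += mu * (np.outer(gy, x) - np.outer(gy, gy) @ W)

Y = W @ X                             # estimated independent components
```

As usual in ICA, the recovered components in this sketch are identified only up to permutation, sign, and scale; a decaying learning rate or a batch version of the update, as mentioned in the abstract, would typically be used for more reliable convergence.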
