Abstract

A Bayesian Kullback Ying–Yang dependence reduction system and theory is presented. Via stochastic approximation, implementable algorithms and criteria are given for parameter learning and model selection, respectively. Three typical architectures are further studied in several special cases. The forward architecture is a general information-theoretic dependence reduction model that maps an observation x into a representation y of k independent components, with k detectable by the criteria. For the special case of an invertible map x→ y, a general adaptive algorithm is obtained, which not only is applicable to nonlinear or post-nonlinear mixtures, but also provides an adaptive EM algorithm that implements the previously proposed learned parametric mixture method for independent component analysis (ICA) on linear mixtures. The backward architecture provides a maximum likelihood independent factor model for modeling observations generated from an unknown number of independent factors via a linear or nonlinear system under noisy conditions. For the special cases of linear or post-nonlinear mixtures under Gaussian noise, a simplified adaptive algorithm and a criterion for detecting k are given, together with a suggested approximately optimal linear mapping x→ y. Moreover, if the independent factors are assumed to be standard Gaussians, we are further led to conventional factor analysis, but with a new adaptive algorithm for its estimation and a criterion for deciding the number of factors. The bi-directional architecture combines the advantages of the backward and forward ones. A mean-field approximation is presented, with a simplified adaptive parameter learning algorithm and an approximate k-selection criterion. Moreover, its special cases lead to the existing least mean square error reconstruction learning and the one-hidden-layer deterministic Helmholtz machine, with new findings. Also, a specific degenerate case of the bi-directional architecture results in a non-invertible but adaptively implementable forward onto mapping for ICA. Successful experiments on binary sources are demonstrated.
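As a rough illustration of the kind of adaptive algorithm the linear-mixture ICA special case refers to, the following minimal sketch applies a standard natural-gradient ICA update to a linear mixture. The fixed tanh score function, learning rate, and toy data here are assumptions made for the sketch only; the paper's learned parametric mixture method instead learns the component nonlinearities adaptively, which this sketch does not attempt.

```python
import numpy as np

def natural_gradient_ica_step(W, x, lr=0.01):
    """One adaptive update of the demixing matrix W for linear ICA.

    Generic natural-gradient rule: W <- W + lr * (I - phi(y) y^T) W,
    with y = W x and phi a fixed nonlinearity (here tanh, suited to
    super-Gaussian sources). Shown only to illustrate the flavor of an
    adaptive ICA algorithm on linear mixtures.
    """
    y = W @ x                                # current source estimate
    phi = np.tanh(y)                         # assumed fixed score function
    I = np.eye(W.shape[0])
    return W + lr * (I - np.outer(phi, y)) @ W

# Toy usage: demix a 2-D linear mixture of super-Gaussian sources.
rng = np.random.default_rng(0)
s = rng.laplace(size=(2, 5000))              # independent super-Gaussian sources
A = rng.standard_normal((2, 2))              # unknown mixing matrix
X = A @ s                                    # observed linear mixtures
W = np.eye(2)
for x in X.T:
    W = natural_gradient_ica_step(W, x)
# W @ A should now be close to a scaled permutation matrix.
```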
