Abstract

At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually obtained via nonparametric density estimation, for example, kernel density estimation. While not as popular as kernel density estimators, orthogonal functions can also be used for nonparametric density estimation, via a truncated series expansion whose coefficients are calculated from the observed data. Unlike kernel density estimators, such estimators do not necessarily yield a valid density; they are, however, faster to calculate, in particular for a modified version of Renyi's entropy of order 2. In this paper, we compare the performance of ICA using Hermite series based estimates of Shannon's and Renyi's mutual information to that of Gaussian kernel based estimates. The comparisons also include ICA using the RADICAL estimate of Shannon's entropy and a FastICA estimate of neg-entropy.
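
To see why the order-2 case is cheap, here is a hedged sketch (a simplification, not the paper's modified version of the entropy): assume the basis functions $\xi_n$ are orthonormal on the real line, i.e. $\int \xi_m(x)\,\xi_n(x)\,dx = \delta_{mn}$. Substituting the truncated series estimate $\hat{p}(x) = \sum_{n=0}^{M}\hat{a}_n\xi_n(x)$ into Renyi's entropy of order 2 gives

$$\hat{H}_2(X) = -\log\int \hat{p}(x)^2\,dx = -\log\sum_{n=0}^{M}\hat{a}_n^2, \qquad \hat{a}_n = \frac{1}{N}\sum_{i=1}^{N}\xi_n(x_i),$$

which costs $O(NM)$ operations. The corresponding Gaussian kernel plug-in estimate of $\int p^2$ requires a double sum over all $N^2$ pairs of sample points.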

Highlights

  • Many of the techniques used to perform Independent Component Analysis (ICA) minimize an “information” measure

  • One of the pioneering papers for ICA [4] uses the Gram-Charlier expansion to give an approximation of differential entropy in terms of third and fourth order cumulants; FastICA [7] approximates neg-entropy, which is a measure of non-Gaussianity, with a number of non-linear functions (see the sketch after this list); RADICAL [10] uses order statistics to estimate differential entropy; in [3] kernel density estimators are used to estimate differential entropy; in [6] kernel density estimators are used to estimate Renyi’s mutual information; in [5], nonparametric density estimation using Legendre polynomials is used to estimate Renyi’s mutual information

  • A variety of criteria exist for choosing M; see [9] for a review. Unlike kernel density estimators, this basis expansion density estimate may not be a proper density in terms of non-negativity and integration to 1. It is, however, much faster to calculate than kernel density estimators for large N
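
As a point of reference for the FastICA comparison mentioned above, here is a minimal sketch using scikit-learn's FastICA (an assumed third-party implementation, not the code used in the paper); the `fun` argument selects one of the non-linear functions that approximate neg-entropy:

```python
import numpy as np
from sklearn.decomposition import FastICA  # assumed available; not the paper's code

# Two independent non-Gaussian sources, linearly mixed
rng = np.random.default_rng(0)
s = np.c_[rng.uniform(-1, 1, 5000), rng.laplace(size=5000)]  # uniform + Laplace
A = np.array([[1.0, 0.5], [0.4, 1.0]])                       # mixing matrix
x = s @ A.T                                                  # observed mixtures

# fun='logcosh' picks one of the neg-entropy approximating non-linearities
ica = FastICA(n_components=2, fun="logcosh", whiten="unit-variance", random_state=0)
s_hat = ica.fit_transform(x)   # recovered sources
```

As is standard for ICA, the recovered sources are identified only up to permutation, sign, and scale.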


Summary

Introduction

Many of the techniques used to perform Independent Component Analysis (ICA) minimize an “information” measure. This paper compares performing ICA by Hermite series based nonparametric estimation of Shannon’s and Renyi’s mutual information to ICA using Gaussian kernel based estimates. For a random variable X with density $p(x)$ and a set of basis functions $\{\xi_n(x)\}_{n=0}^{\infty}$ which are orthogonal with respect to a kernel $K(x)$, an estimate of $p(x)$ is

$$\hat{p}(x) = K(x)\sum_{n=0}^{M}\hat{a}_n\,\xi_n(x), \qquad \hat{a}_n = \frac{1}{N}\sum_{i=1}^{N}\xi_n(x_i),$$

where $x_1,\dots,x_N$ are the observed data and $M$ is the truncation point.
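
To make this concrete, here is a minimal Python sketch of such an estimator using Hermite functions, which are orthonormal on the real line (the special case $K(x)\equiv 1$); the function name, the default $M$, and the reliance on NumPy's Hermite routines are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from math import sqrt, pi, factorial
from numpy.polynomial.hermite import hermval  # evaluates sums of physicists' H_n

def hermite_density_estimate(samples, M=8):
    """Truncated Hermite series density estimate (illustrative sketch).

    Uses the Hermite functions h_n(x) = H_n(x) exp(-x^2/2) / sqrt(2^n n! sqrt(pi)),
    which are orthonormal on the real line, so the coefficients are just
    sample means of the basis functions evaluated at the data.
    """
    norms = [sqrt(2.0**n * factorial(n) * sqrt(pi)) for n in range(M + 1)]

    def h(n, x):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0                       # selects H_n inside hermval
        return hermval(x, coeffs) * np.exp(-x**2 / 2.0) / norms[n]

    # a_n = (1/N) * sum_i h_n(x_i): coefficients calculated from the data
    a = [np.mean(h(n, samples)) for n in range(M + 1)]

    def p_hat(x):
        x = np.asarray(x, dtype=float)
        # Note: the truncated series is not guaranteed to be non-negative
        return sum(a[n] * h(n, x) for n in range(M + 1))

    return p_hat

# Example: estimate a standard normal density from 2000 samples
rng = np.random.default_rng(0)
p_hat = hermite_density_estimate(rng.standard_normal(2000))
print(p_hat(np.array([-1.0, 0.0, 1.0])))      # roughly 0.242, 0.399, 0.242
```

Computing all M+1 coefficients takes $O(NM)$ work, and the same coefficients give the order-2 Renyi entropy estimate sketched earlier, which is the source of the speed advantage over $O(N^2)$ kernel methods for large N.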
