Abstract

We have previously proposed quadratic Rényi mutual information (Rényi, 1970), estimated by Parzen windowing, as an ICA criterion, and showed that it uses data more efficiently than classical algorithms such as InfoMax and FastICA. We advocated Rényi's definitions of information-theoretic quantities rather than Shannon's, since Shannon's definitions are included in Rényi's as special cases. In kernel-based probability density estimation, the choice of kernel width strongly affects overall performance, and no general method is known for determining its optimal value. Legendre polynomial expansion of a probability density, by contrast, has two advantages: hardware implementation is trivial, and it requires no parameter choice other than the truncation point of the series. The rule for that choice is simple: the longer the series, the more accurate the density estimate becomes. We therefore combine these two schemes, Rényi's entropy and Legendre polynomial expansion of the probability density function, to obtain a simple ICA algorithm. The algorithm is then tested on blind source separation, time-series analysis, and data reduction.
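The two building blocks named in the abstract can be sketched briefly. For a density supported on [-1, 1], the orthogonal-series estimator expands f(x) ≈ Σₖ cₖ Pₖ(x) with cₖ = (2k+1)/2 · E[Pₖ(X)], where the expectation is replaced by a sample mean; orthogonality of the Legendre polynomials then gives ∫ f̂² dx in closed form, so the quadratic Rényi entropy H₂ = −log ∫ f̂² dx follows without any kernel-width choice. This is a minimal illustrative sketch of the standard orthogonal-series estimator, not the authors' implementation; the function names and the truncation order K are our own choices.

```python
import numpy as np
from numpy.polynomial.legendre import legval

def legendre_density_coeffs(samples, K):
    """Orthogonal-series coefficients for a density on [-1, 1]:
    c_k = (2k+1)/2 * (1/N) * sum_i P_k(x_i).
    The only free parameter is the truncation order K."""
    coeffs = []
    for k in range(K + 1):
        # A unit coefficient vector [0, ..., 0, 1] selects P_k in legval.
        pk_vals = legval(samples, [0.0] * k + [1.0])
        coeffs.append((2 * k + 1) / 2.0 * pk_vals.mean())
    return np.array(coeffs)

def renyi_quadratic_entropy(coeffs):
    """H_2 = -log ∫ f̂(x)^2 dx, using the orthogonality relation
    ∫ P_j(x) P_k(x) dx = 2/(2k+1) δ_jk on [-1, 1]."""
    k = np.arange(len(coeffs))
    integral_f_squared = np.sum(coeffs**2 * 2.0 / (2 * k + 1))
    return -np.log(integral_f_squared)

# Sanity check: for X ~ Uniform(-1, 1), f = 1/2, so H_2 = log 2.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 20000)
c = legendre_density_coeffs(x, 6)
h2 = renyi_quadratic_entropy(c)
```

Note that c₀ = 1/2 exactly for any sample (P₀ ≡ 1), and for uniform data the higher coefficients vanish as N grows, so h2 approaches log 2 ≈ 0.693; in an ICA setting this entropy estimate would be evaluated on the separated outputs and minimized (or the corresponding mutual information minimized) with respect to the demixing matrix.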
