Abstract

We present a new algorithm for independent component analysis with provable performance guarantees. In particular, suppose we are given samples of the form $y = Ax + \eta$, where $A$ is an unknown non-singular $n \times n$ matrix, $x$ is a random variable whose coordinates are independent and each have fourth moment strictly less than that of a standard Gaussian random variable, and $\eta$ is an $n$-dimensional Gaussian random variable with unknown covariance $\Sigma$. We give an algorithm that provably recovers $A$ and $\Sigma$ up to an additive $\epsilon$, with running time and sample complexity polynomial in $n$ and $1/\epsilon$. To accomplish this, we introduce a novel quasi-whitening step that may be useful in other applications where there is additive Gaussian noise with unknown covariance. We also give a general framework for finding all local optima of a function, given an oracle for approximately finding just one. This framework is a crucial step in our algorithm, one that has been overlooked in previous attempts, and it allows us to control the accumulation of error as we recover the columns of $A$ one by one via local search.
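To make the model concrete, here is a minimal simulation sketch of the data-generating process $y = Ax + \eta$ described above. All variable names and dimensions (`n`, `num_samples`, the choice of uniform sources) are illustrative assumptions, not part of the paper; standardized uniform sources are used only because their fourth moment ($9/5$) is strictly below the standard Gaussian's ($3$), satisfying the stated moment condition.

```python
import numpy as np

# Hypothetical sketch of the noisy ICA model: y = A x + eta, with
# x having independent coordinates whose fourth moment is strictly
# below that of a standard Gaussian (which is 3), and
# eta ~ N(0, Sigma) for a covariance Sigma unknown to the algorithm.

rng = np.random.default_rng(0)
n, num_samples = 4, 50000  # illustrative sizes

# Unknown non-singular n x n mixing matrix A.
A = rng.standard_normal((n, n))

# Independent sources: uniform on [-sqrt(3), sqrt(3)] has unit
# variance and fourth moment 9/5 < 3.
x = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, num_samples))

# Gaussian noise with an arbitrary (unknown) covariance Sigma.
B = rng.standard_normal((n, n))
Sigma = B @ B.T
eta = rng.multivariate_normal(np.zeros(n), Sigma, size=num_samples).T

# The observed samples handed to the algorithm.
y = A @ x + eta

# Empirical check: each source's fourth moment sits below 3.
print(np.mean(x**4, axis=1))
```

The algorithm itself only sees `y`; recovering `A` and `Sigma` from such samples is exactly the problem the abstract's quasi-whitening and local-search framework address.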
