Abstract

Most source separation methods focus on stationary sources, so higher-order statistics are necessary for successful separation unless the sources are temporally correlated. For nonstationary sources, however, it was shown [Neural Networks 8 (1995) 411] that source separation can be achieved by second-order decorrelation. In this paper, we consider the cost function proposed by Matsuoka et al. [Neural Networks 8 (1995) 411] and derive natural gradient learning algorithms for both a fully connected recurrent network and a feedforward network. Since our algorithms employ the natural gradient method, they possess the equivariant property and follow the steepest descent direction, unlike the algorithm in [Neural Networks 8 (1995) 411]. We also show that our algorithms are always locally stable, regardless of the probability distributions of the nonstationary sources.
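The abstract does not spell out the update rule, so the following NumPy sketch is an illustration only. It assumes a Matsuoka-style second-order cost Q(W) = (1/2)[sum_i log E{y_i^2} - log det E{y y^T}] with y = W x, and a natural-gradient, multiplicative update of the assumed form W <- W + eta (I - Lambda^{-1} y y^T) W, where Lambda tracks running estimates of E{y_i^2}. The toy source model, step sizes, and variable names are hypothetical and not taken from the paper.

```python
# Hedged sketch (not the paper's exact algorithm): natural-gradient
# second-order decorrelation of nonstationary sources.
import numpy as np

rng = np.random.default_rng(0)
n, T = 2, 20000

# Nonstationary toy sources: Gaussian noise with slowly varying variance.
t = np.arange(T)
s = np.vstack([
    np.sin(2 * np.pi * t / 3000) ** 2 + 0.1,
    np.cos(2 * np.pi * t / 4500) ** 2 + 0.1,
]) ** 0.5 * rng.standard_normal((n, T))

A = rng.standard_normal((n, n))   # unknown mixing matrix
x = A @ s                         # observed mixtures

W = np.eye(n)                     # demixing matrix (feedforward network)
lam = np.ones(n)                  # running estimates of E{y_i^2}
eta, alpha = 5e-4, 1e-3           # step size, variance-tracking rate

for k in range(T):
    y = W @ x[:, k]
    lam = (1 - alpha) * lam + alpha * y**2    # track nonstationary power
    G = np.eye(n) - np.outer(y / lam, y)      # I - Lambda^{-1} y y^T
    W += eta * G @ W                          # assumed natural-gradient step

# W @ A should approach a scaled permutation if separation succeeded.
print(np.round(W @ A, 2))
```

The multiplicative form of the step (a correction term right-multiplied by W) is what gives natural-gradient algorithms of this kind their equivariant behavior: the trajectory of the global system W A does not depend on the particular mixing matrix A. Only second-order statistics of the outputs appear in the update, consistent with the second-order decorrelation approach described above.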
