Abstract

Nonnegative matrix factorization (NMF) is a powerful tool in data analysis for discovering latent features and part-based patterns in high-dimensional data; it is a special case of matrix factorization in which the factor matrices are constrained to be nonnegative and of low rank. To apply NMF to very large matrices, we focus on the stochastic multiplicative update (MU) rule, which is the most popular update rule but suffers from slow convergence. This paper introduces a gradient-averaging technique for the stochastic gradients used in the stochastic MU rule, and proposes an accelerated stochastic multiplicative update rule, SAGMU. Extensive computational experiments on both synthetic and real-world datasets demonstrate the effectiveness of SAGMU.
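Since the abstract does not reproduce the update rule itself, the following is a minimal sketch of the classical full-batch multiplicative update (MU) rule for Frobenius-loss NMF, the baseline that the stochastic variant discussed above builds on. All names (`nmf_mu`, `rank`, `n_iters`, `eps`) are illustrative assumptions, and the SAGMU rule proposed in the paper is not reproduced here.

```python
# Minimal sketch: classical multiplicative updates (Lee & Seung) for
# NMF under the Frobenius loss, X ~= W @ H with W, H >= 0.
# This is the full-batch baseline; the paper's SAGMU (not shown here)
# replaces the full-batch numerator/denominator terms with averaged
# mini-batch (stochastic-gradient) estimates.
import numpy as np

def nmf_mu(X, rank, n_iters=200, eps=1e-10, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iters):
        # Elementwise multiply each factor by the ratio of the negative
        # and positive parts of its gradient; eps avoids division by zero.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    X = np.abs(np.random.default_rng(1).random((100, 80)))
    W, H = nmf_mu(X, rank=10)
    print("relative error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```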
