Abstract

Nonnegative matrix factorization (NMF) decomposes a nonnegative matrix into the product of two low-dimensional nonnegative matrices. Traditional NMF risks learning rank-deficient bases on high-dimensional datasets. In this work, we propose a new class of multiplicative algorithms for NMF based on the family of Alpha-Beta log-determinant (AB log-det) divergences, which are parametrized by two parameters (α and β) and the log-determinant function. We develop multiplicative updating rules that optimize the AB log-det divergence within a block coordinate descent framework and prove their convergence theoretically. Experimental results on popular datasets show that the proposed AB log-det NMF outperforms traditional NMF methods. The proposed family of AB log-det NMF multiplicative updating rules is also shown to improve robustness, and the AB log-det divergences are very promising for the rank-reduction problem.

Keywords: Nonnegative matrix factorization (NMF); Robust multiplicative NMF algorithms; Similarity measures; Generalized divergences; Alpha-Beta log-det divergences
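The abstract does not reproduce the paper's AB log-det update rules. As background on the general form such rules take, the following is a minimal sketch of the classic multiplicative-update NMF of Lee and Seung for the Frobenius norm, which the proposed algorithms generalize; the function name, iteration count, and stabilizing `eps` constant here are illustrative assumptions, not the paper's method:

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Frobenius-norm NMF via multiplicative updates (Lee & Seung).

    Factorizes nonnegative V (m x n) into W (m x rank) @ H (rank x n).
    Background sketch only; the paper's AB log-det updates differ.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates keep W and H nonnegative by construction,
        # since every factor in the update ratio is nonnegative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factorize a random nonnegative matrix with rank 5
V = np.abs(np.random.default_rng(1).random((30, 20)))
W, H = nmf_multiplicative(V, rank=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Each update multiplies the current factor elementwise by a ratio of nonnegative terms, so nonnegativity is preserved without projection; divergence-specific rules (such as those derived from the AB log-det family) change the numerator and denominator of that ratio.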

