Abstract

Rank determination is one of the most significant issues in non-negative matrix factorization (NMF) research, yet it has received far less attention than sparseness regularization: the rank of the base matrix usually has to be assumed in advance. In this paper, we propose an unsupervised multi-level non-negative matrix factorization model that extracts the hidden data structure and seeks the rank of the base matrix. From a machine learning point of view, the learning result depends on the prior knowledge available to the algorithm. In our unsupervised multi-level model, we construct a three-level data structure for the non-negative matrix factorization algorithm. This construction supplies more prior knowledge to the algorithm and yields a better approximation of the real data structure. The final selection of bases is achieved through L2-norm optimization. We evaluate the approach on binary datasets, and the results demonstrate that it retrieves the hidden structure of the data and thus determines the correct rank of the base matrix.
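
As a rough illustration of the kind of pipeline the abstract describes (not the authors' exact multi-level model), the sketch below factorizes a toy binary matrix with a deliberately over-sized rank using canonical multiplicative-update NMF, and then prunes bases whose L2-norm contribution is negligible to estimate the effective rank. The toy data, the update rule, and the pruning threshold are all illustrative assumptions.

```python
import numpy as np

def nmf_multiplicative(V, k, n_iter=500, eps=1e-9, seed=0):
    """Canonical NMF with Lee-Seung multiplicative updates,
    minimizing the squared Frobenius error ||V - W H||^2."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy binary data standing in for the binary datasets mentioned above.
V = (np.random.default_rng(1).random((100, 64)) > 0.7).astype(float)

# Start with an over-sized rank, then keep only bases with a
# non-negligible L2-norm contribution (threshold is an assumption,
# not a value from the paper).
W, H = nmf_multiplicative(V, k=20)
contrib = np.linalg.norm(W, axis=0) * np.linalg.norm(H, axis=1)
keep = contrib > 0.05 * contrib.max()
print("estimated rank:", int(keep.sum()))
```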

Highlights

  • For non-negative matrix factorization (NMF) research, the cost function and initialization problems of NMF are the main issues for researchers

  • We propose an unsupervised multi-level non-negative matrix factorization model to extract the hidden data structure and seek the rank of the base matrix

  • The results demonstrate that our approach is able to retrieve the hidden structure of the data and determine the correct rank of the base matrix


Summary

Introduction

For NMF research, the cost function and initialization problems of NMF are the main issues for researchers. The main challenge of the rank determination problem is that the rank must be pre-defined: it is hard to know the correct rank of the base matrix before the components have been updated. This is why the canonical NMF method and traditional probabilistic methods such as maximum likelihood (ML) and maximum a posteriori (MAP) estimation cannot handle the rank determination problem. In this paper, we propose an unsupervised multi-level model that automatically seeks the correct rank of the base matrix. We use the L2-norm to show the contribution of the hyper-prior to learning the correct bases. Experimental results on two binary datasets demonstrate that our method is efficient and robust.
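
To make concrete why the pre-defined rank is a problem for canonical NMF: with a standard implementation the number of bases must be fixed before any updates run, so a common but unsatisfying workaround is to sweep candidate ranks and compare reconstruction errors. The snippet below uses scikit-learn's NMF and synthetic binary data purely to illustrate this; it is not the authors' multi-level model.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic binary data standing in for the two binary datasets.
rng = np.random.default_rng(0)
V = (rng.random((100, 64)) > 0.7).astype(float)

# Canonical NMF requires the rank (n_components) up front, so the usual
# workaround is an explicit sweep over candidate ranks.
for k in (2, 4, 8, 16):
    model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(V)   # n x k activations
    H = model.components_        # k x m bases
    print(f"rank {k}: reconstruction error {model.reconstruction_err_:.3f}")
```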

Related Work
Unsupervised Multi-Level Non-Negative Matrix Factorization Model
Model Construction
Inference
Experimental Results and Evaluation
Fence Dataset
Swimmer Dataset
Conclusion
