Abstract

Non-negative matrix factorization (NMF) is a fundamental technique that has received much attention and is widely used in image engineering, pattern recognition and other fields. However, classical NMF has limitations: it captures only local information, is sensitive to noise, and suffers from small sample size (SSS) problems. How to extend NMF to improve the performance and robustness of the algorithm is therefore a worthwhile challenge. Motivated by these bottlenecks, we propose an exponential graph regularization non-negative low-rank factorization algorithm (EGNLRF) that combines sparseness, low rank and the matrix exponential. First, under the assumption that the data are corrupted, we decompose the given raw data matrix into a clean component and an error matrix that fits the noise, applying a low-rank constraint to the denoised component. Then, we perform a non-negative factorization on the resulting low-rank matrix, from which we derive a low-dimensional representation of the original matrix. Finally, we use the low-dimensional representation for graph embedding to preserve the geometry between samples. The graph embedding term is matrix-exponentiated to cope with SSS problems and nearest-neighbor sensitivity. These three steps are incorporated into a joint framework in which they validate and optimize each other, so we can learn latent data representations that are undisturbed by noise while preserving the local structure of the samples. We conducted experiments on different datasets and verified the effectiveness of the algorithm by comparing the proposed method with existing ones based on NMF, low rank and graph embedding.
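The three ingredients described above can be sketched numerically. The snippet below is a minimal, hypothetical illustration (not the authors' exact formulation): it assumes an objective of the form fidelity + low-rank (nuclear norm) + sparse error (L1) + exponential graph regularizer, builds a k-nearest-neighbor Laplacian over the samples, and exponentiates it via an eigendecomposition; the variable names, weights and toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(20, 15)))  # toy non-negative data: 20 features, 15 samples

def knn_laplacian(X, k=3):
    """Unnormalized Laplacian of a k-NN graph over the columns (samples) of X."""
    n = X.shape[1]
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)  # pairwise sq. distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # k nearest neighbors, skipping self
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)                # symmetrize the adjacency
    return np.diag(W.sum(axis=1)) - W     # L = D - W

L = knn_laplacian(X)

# Matrix exponential of the symmetric Laplacian via eigendecomposition.
# expm(L) is always positive definite (full rank), which is how the
# exponential trick addresses small-sample-size degeneracy.
w, Q = np.linalg.eigh(L)
expL = Q @ np.diag(np.exp(w)) @ Q.T

# Hypothetical non-negative factors and residual noise term (Z ~ U V, E = X - Z).
r = 4
U = np.abs(rng.normal(size=(20, r)))
V = np.abs(rng.normal(size=(r, 15)))
E = X - U @ V

# Illustrative objective combining the abstract's three ingredients;
# the weights alpha, beta, gamma are placeholder assumptions.
alpha, beta, gamma = 0.1, 0.1, 0.01
obj = (np.linalg.norm(X - U @ V - E, "fro") ** 2      # data fidelity after denoising
       + alpha * np.linalg.norm(U @ V, "nuc")         # low-rank (nuclear norm) term
       + beta * np.abs(E).sum()                       # sparse noise term
       + gamma * np.trace(V @ expL @ V.T))            # exponential graph regularizer
print(obj)
```

Minimizing such an objective would require alternating updates for U, V and E; the sketch only evaluates the terms to show how the matrix exponential enters the graph regularizer.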
