Abstract

This paper studies the influence of various NMF algorithms on the classification accuracy of several classifiers and compares the classifiers among themselves. We focus on a fast nonnegative matrix factorization (NMF) algorithm based on a discrete-time projection neural network (DTPNN). The NMF algorithm is combined with three classifiers in order to determine how the dimensionality reduction performed by the NMF algorithm affects the accuracy rate of the classifiers. The convergent objective function values, in terms of two popular objective functions, the Frobenius norm and the Kullback–Leibler (K-L) divergence, are demonstrated for different NMF-based algorithms on a wide range of data sets. The CPU running times in terms of these objective functions for different combinations of NMF algorithms and data sets are also reported. Moreover, the convergence behaviors of the different NMF methods are illustrated. To test its effectiveness on classification accuracy, a performance study of three well-known classifiers is carried out and the influence of the NMF algorithm on their accuracy is evaluated. Furthermore, a confusion matrix module has been incorporated into the algorithms to provide an additional comparison of classification accuracy.
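The pipeline the abstract describes, reducing dimensionality with NMF under either objective function and then classifying in the reduced space, can be sketched with off-the-shelf components. The following is a minimal illustration only, not the paper's DTPNN algorithm; the dataset, component count, and classifier are assumptions chosen for brevity.

```python
# Sketch: NMF dimensionality reduction (Frobenius vs. K-L objective) feeding a
# classifier, with a confusion matrix for the accuracy comparison.
# Uses scikit-learn stand-ins, not the DTPNN-based NMF from the paper.
from sklearn.datasets import load_digits
from sklearn.decomposition import NMF
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

X, y = load_digits(return_X_y=True)            # nonnegative pixel intensities
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for beta_loss in ("frobenius", "kullback-leibler"):
    # The multiplicative-update solver supports both objective functions.
    nmf = NMF(n_components=30, solver="mu", beta_loss=beta_loss,
              max_iter=500, random_state=0)
    W_train = nmf.fit_transform(X_train)       # reduced representation
    W_test = nmf.transform(X_test)

    clf = KNeighborsClassifier().fit(W_train, y_train)
    y_pred = clf.predict(W_test)

    print(beta_loss, "accuracy:",
          round(accuracy_score(y_test, y_pred), 3))
    print(confusion_matrix(y_test, y_pred))    # per-class error breakdown
```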
