Abstract

Kernel nonnegative matrix factorization (KNMF) is an extension of NMF designed to capture nonlinear dependencies in the data matrix through kernel functions. In KNMF, the size of the kernel matrices is closely tied to that of the input data matrix, so their computation consumes a large amount of memory and computing resources. When applied to large-scale hyperspectral data, KNMF often hits a memory bottleneck and may even run out of memory. Moreover, when dealing with dynamically acquired data, KNMF requires recomputation over the whole data set whenever newly acquired data arrive, which imposes huge memory and computational demands. To reduce memory usage and improve computational efficiency when applying KNMF to large-scale and dynamic hyperspectral data, we extend KNMF by introducing partition matrix theory and considering the relationships among the divided blocks. The decomposition results for the hyperspectral data are derived incrementally from much smaller matrices that combine the previously obtained results with the newly arrived data blocks. In this paper, we propose an incremental KNMF (IKNMF) to reduce the computing requirements for large-scale data in hyperspectral unmixing. An improved IKNMF (IIKNMF) is also proposed to further improve the abundance results of IKNMF. Experiments are conducted on both synthetic and real hyperspectral data sets. The experimental results demonstrate that the proposed methods can effectively save memory resources without degrading the unmixing performance, and that the proposed IIKNMF achieves better abundance results than both IKNMF and KNMF.
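The abstract's key observation is that the kernel matrix grows quadratically with the number of pixels, while the kernel of an augmented data set has a natural block (partition) structure, so only the blocks involving the new data need to be formed. The following minimal numpy sketch illustrates this block structure under stated assumptions: the `rbf_kernel` helper, the Gaussian kernel choice, and the toy dimensions are illustrative and are not the paper's actual IKNMF update rules.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Illustrative helper (not from the paper): pairwise Gaussian kernel
    # between the columns of A (bands x n) and B (bands x m) -> (n, m).
    sq = (np.sum(A**2, axis=0)[:, None]
          + np.sum(B**2, axis=0)[None, :]
          - 2.0 * A.T @ B)
    return np.exp(-sq / (2.0 * sigma**2))

# Full-data kernel: an n x n matrix, quadratic in the number of pixels n.
n_bands, n_pixels = 50, 1000
X = np.abs(np.random.rand(n_bands, n_pixels))
K_full = rbf_kernel(X, X)          # shape (1000, 1000)

# Block-wise (incremental) view: when a new block X_new arrives, only the
# cross block K(X_old, X_new) and the diagonal block K(X_new, X_new) must
# be computed; K(X_old, X_old) is reused from the previous step instead of
# recomputing the kernel of the whole data set [X_old, X_new].
X_old, X_new = X[:, :800], X[:, 800:]
K_oo = rbf_kernel(X_old, X_old)    # already available from the previous step
K_on = rbf_kernel(X_old, X_new)    # new cross block
K_nn = rbf_kernel(X_new, X_new)    # new diagonal block
K_inc = np.block([[K_oo, K_on], [K_on.T, K_nn]])
assert np.allclose(K_full, K_inc)  # partitioned assembly matches the full kernel
```

In this sketch the peak cost of an update is dominated by the new blocks rather than the full n x n kernel, which is the memory saving the incremental formulation targets; the factorization itself (endmember and abundance updates) is omitted.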
