Abstract

Multinomial logistic regression (MLR) is an effective classifier for spatial–spectral hyperspectral image (HSI) classification. However, in some typical settings, such as Gaussian regularized MLR (GSMLR) and Laplacian graph regularized MLR (LPMLR), learning the regressors requires solving a large (cd) × (cd) linear system, which is prohibitive in both space and time complexity (c is the number of classes and d is the feature length). Even with medium-sized features, the computation often runs out of memory. To this end, we propose two exact divide-and-conquer (DC) algorithms, DC-GSMLR and DC-LPMLR, to reduce the computational complexity. Both decompose the regressor learning problem into a series of equivalent smaller subproblems, each of which can be solved in closed form. Unlike existing approximation methods, they provide exact merged solutions. With the same accuracy, DC-LPMLR and DC-GSMLR only need to solve c + 1 and 2 linear systems of size d × d, respectively, significantly reducing peak memory usage by roughly O(c) and O(c²/2) times. In terms of time, experiments on two popular HSI datasets show speedup ratios as high as one to two orders of magnitude, demonstrating the practicality of the proposed algorithms in real applications.
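
As a rough illustration of why decomposing a (cd) × (cd) solve into d × d subproblems pays off, the following minimal sketch (not the paper's DC-GSMLR/DC-LPMLR derivation) assumes a hypothetical block-diagonal system with c blocks of size d × d, so each block can be solved independently and the merged solution matches the full solve exactly; the matrices and sizes here are made-up example values.

```python
# Illustrative sketch only: a hypothetical block-diagonal (cd) x (cd) system,
# not the exact decomposition used by DC-GSMLR/DC-LPMLR in the paper.
import numpy as np

c, d = 8, 100            # number of classes, feature length (example values)
rng = np.random.default_rng(0)

# c independent d x d subproblems A_k w_k = b_k
A_blocks = [np.eye(d) + 0.01 * rng.standard_normal((d, d)) for _ in range(c)]
b_blocks = [rng.standard_normal(d) for _ in range(c)]

# Naive approach: assemble and solve the full (cd) x (cd) system.
# Peak memory ~ (c*d)^2 entries; time ~ O((c*d)^3).
A_full = np.zeros((c * d, c * d))
for k, A_k in enumerate(A_blocks):
    A_full[k * d:(k + 1) * d, k * d:(k + 1) * d] = A_k
w_full = np.linalg.solve(A_full, np.concatenate(b_blocks))

# Divide-and-conquer: solve c systems of size d x d and merge the results.
# Peak memory ~ d^2 entries per subproblem; time ~ O(c * d^3).
w_dc = np.concatenate([np.linalg.solve(A_k, b_k)
                       for A_k, b_k in zip(A_blocks, b_blocks)])

print(np.allclose(w_full, w_dc))   # True: the merged solution is exact
```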
