Abstract

In this paper, we address the problem of redundancy reduction of high-dimensional noisy signals that may contain anomaly (rare) vectors, which we wish to preserve. Since anomaly data vectors contribute weakly to the l2-norm of the signal as compared to the noise, l2-based criteria are unsatisfactory for obtaining a good representation of these vectors. As a remedy, a new approach, named Min-Max-SVD (MX-SVD), was recently proposed for signal-subspace estimation; it attempts to minimize the maximum of data-residual l2-norms, denoted as l2,∞, and is designed to represent well both abundant and anomaly measurements. However, the MX-SVD algorithm is greedy and only approximately minimizes the proposed l2,∞-norm of the residuals. In this paper we develop an optimal algorithm for the minimization of the l2,∞-norm of data misrepresentation residuals, which we call Maximum Orthogonal complements Optimal Subspace Estimation (MOOSE). The optimization is performed via a natural conjugate gradient learning approach carried out on the set of n-dimensional subspaces of ℝ^m, m ≫ n, which is a Grassmann manifold. The results of applying MOOSE, MX-SVD, and l2-based approaches are demonstrated on both simulated and real hyperspectral data.
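For clarity, the criterion described above can be stated explicitly. Writing the data vectors as x_1, ..., x_N ∈ ℝ^m and P_S for the orthogonal projector onto a candidate n-dimensional subspace S (notation introduced here for illustration, not taken from the paper), the l2,∞ objective that MOOSE minimizes reads

    \min_{\dim(S) = n} \; \max_{1 \le i \le N} \; \| x_i - P_S x_i \|_2 ,

i.e., the worst-case l2 misrepresentation residual over all measurements, so that rare (anomaly) vectors are not sacrificed in favor of the abundant ones, as they would be under a sum-of-squares (l2) criterion.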
