Abstract

In unsupervised learning, most of the discriminative information is encoded in the cluster labels. To obtain pseudo labels, unsupervised feature selection methods typically generate them via spectral clustering. However, two related drawbacks follow: 1) the performance of feature selection depends heavily on the constructed Laplacian matrix, and 2) the pseudo labels are obtained with mixed signs, whereas the true labels should be nonnegative. To address these problems, a novel approach for unsupervised feature selection is proposed by extending orthogonal least squares discriminant analysis (OLSDA) to the unsupervised case, so that nonnegative pseudo labels can be obtained. Additionally, an orthogonal constraint is imposed on the class indicator to preserve the manifold structure. Furthermore, l2,1 regularization is imposed to make the projection matrix row sparse for efficient feature selection, and this regularization is proved to be equivalent to l2,0 regularization. Finally, extensive experiments on nine benchmark data sets demonstrate the effectiveness of the proposed approach.
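To make the role of l2,1 regularization concrete, the following is a minimal sketch, not the authors' exact algorithm: it assumes a nonnegative pseudo-label matrix Y is already given and solves a plain l2,1-regularized least-squares problem by iteratively reweighted least squares, then ranks features by the l2 norm of the corresponding rows of the projection matrix W. The paper's full model instead learns the pseudo labels jointly under orthogonal and nonnegative constraints, which this sketch omits; the function names and parameters here are illustrative only.

```python
import numpy as np

def l21_feature_scores(X, Y, gamma=1.0, n_iter=30, eps=1e-8):
    """Approximately solve  min_W ||X W - Y||_F^2 + gamma * ||W||_{2,1}
    via iteratively reweighted least squares and return per-feature scores."""
    n, d = X.shape
    D = np.eye(d)                                   # diagonal reweighting matrix
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_iter):
        # Closed-form update for fixed D:  W = (X^T X + gamma * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))        # reweight toward row sparsity
    return np.sqrt((W ** 2).sum(axis=1))            # one score per feature

def select_features(X, Y, k, gamma=1.0):
    """Return the indices of the k features with the largest row norms of W."""
    scores = l21_feature_scores(X, Y, gamma)
    return np.argsort(scores)[::-1][:k]

# Hypothetical usage on random data with nonnegative pseudo labels
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    Y = np.abs(rng.standard_normal((100, 3)))       # stand-in for nonnegative pseudo labels
    print(select_features(X, Y, k=5, gamma=0.5))
```

Because the l2,1 norm penalizes whole rows of W, rows corresponding to uninformative features are driven toward zero, which is why ranking features by row norms yields a feature-selection criterion.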
