Abstract

Dimensionality reduction is an effective way to alleviate the curse of dimensionality in high-dimensional data. Contrastive learning, a popular self-supervised learning paradigm, has recently attracted considerable attention. In this paper, we propose NCLDR (Nearest-Neighbor Contrastive Learning with Dual Correlation Loss for Dimensionality Reduction), a novel method that ports a contrastive-learning framework to the specific task of dimensionality reduction. First, NCLDR uses nearest neighbors to construct feature pairs from the training set itself. Then, to produce representations that are invariant across such pairs while decorrelating their feature variables, it employs a simple multi-layer perceptron (MLP) architecture trained with a dual correlation loss. Compared with most dimensionality reduction methods, NCLDR bypasses the cost of constructing and optimizing kNN graphs and readily embeds out-of-sample data. It also alleviates the problem of "dimensional collapse" in the low-dimensional representation space. Finally, experimental results demonstrate that the proposed method achieves significant improvements over state-of-the-art dimensionality reduction methods.
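
The abstract does not spell out the dual correlation loss, but its description (invariance across nearest-neighbor pairs plus decorrelation of feature variables) suggests a Barlow-Twins-style cross-correlation objective. The PyTorch sketch below illustrates that reading; the names `MLPEncoder`, `nearest_neighbor_pairs`, `dual_correlation_loss`, and the weight `lam` are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MLPEncoder(nn.Module):
    """Simple MLP mapping high-dimensional inputs to a low-dimensional embedding."""
    def __init__(self, in_dim: int, hidden_dim: int = 256, out_dim: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def nearest_neighbor_pairs(x: torch.Tensor):
    """For each sample, take its nearest neighbor (excluding itself) as the positive view."""
    d = torch.cdist(x, x)              # pairwise Euclidean distances, shape (n, n)
    d.fill_diagonal_(float("inf"))     # exclude self-matches
    nn_idx = d.argmin(dim=1)
    return x, x[nn_idx]

def dual_correlation_loss(z1: torch.Tensor, z2: torch.Tensor, lam: float = 5e-3):
    """Assumed form of the loss: push the cross-correlation matrix of the two
    embeddings toward the identity. Diagonal terms enforce invariance across
    nearest-neighbor pairs; off-diagonal terms decorrelate feature variables,
    which is what counteracts dimensional collapse."""
    n = z1.shape[0]
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)   # standardize each feature
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = (z1.T @ z2) / n                           # cross-correlation matrix
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lam * off_diag
```

Under this reading, training alternates per mini-batch: build nearest-neighbor pairs, encode both views with the MLP, and minimize the loss. Because the encoder is a plain parametric network, out-of-sample points are embedded with a single forward pass, which is what lets the method avoid per-dataset kNN-graph optimization.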
