Abstract

Sufficient dimension reduction (SDR) is a popular framework for supervised dimension reduction that aims to reduce the dimensionality of input data while maximally preserving information about the output data. On the other hand, in many recent supervised classification tasks, the balance of samples in each class may vary between the training and testing phases. This phenomenon, referred to as class-prior change, causes existing SDR methods to perform poorly, particularly when the training data are highly imbalanced. In this paper, we extend the state-of-the-art SDR method called least-squares gradients for dimension reduction (LSGDR) to cope with such class-prior change under the semi-supervised learning setup, where unlabeled test data are available in addition to labeled training data. Through experiments, we demonstrate the usefulness of the proposed method.
