Abstract

Graph Laplacian manifold regularization exploits the intrinsic geometry of the graph built over both labeled and unlabeled data to approximate an optimal smooth prediction function. The regularization rests on the assumption that abundant unlabeled data add information to the model, increasing accuracy and generalization. However, when a large number of unlabeled data points is paired with only a few labeled points, the learned function degenerates, which restricts generalization. Higher-order regularization through the iterated Laplacian semi-norm resolves this problem, but it is computationally intensive and does not scale. In this paper, we propose a dropout regularization that appropriately balances the supervised loss and the intrinsic regularization term to enforce smoothness. This is achieved by enlarging the labeled set: temporary soft labels are iteratively assigned to subsets of the unlabeled data, after which the final labels are determined. Localized Procrustes analysis is then used to improve the accuracy of label assignment. Experiments on both synthetic and real-world datasets show that the proposed model outperforms the existing graph Laplacian and its variants by a significant margin.
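For reference, the sketch below illustrates the standard graph Laplacian manifold regularization objective (in the LapRLS style) that this work builds on, assuming a linear model f(x) = w^T x and a precomputed symmetric affinity matrix W; the function names and parameter values are hypothetical, and the paper's iterative soft-labeling and localized Procrustes steps are not reproduced here.

import numpy as np

def graph_laplacian(W):
    # Unnormalized graph Laplacian L = D - W from a symmetric affinity matrix W.
    D = np.diag(W.sum(axis=1))
    return D - W

def fit_laprls(X, y, labeled_idx, W, gamma_a=1e-2, gamma_i=1e-1):
    # Closed-form minimizer (linear kernel) of
    #   (1/l) ||X_l w - y_l||^2 + gamma_a ||w||^2 + gamma_i w^T X^T L X w,
    # where the last term penalizes predictions that vary sharply
    # across strongly connected nodes of the graph.
    n, d = X.shape
    l = len(labeled_idx)
    Xl, yl = X[labeled_idx], y[labeled_idx]
    L = graph_laplacian(W)
    A = (Xl.T @ Xl) / l + gamma_a * np.eye(d) + gamma_i * (X.T @ L @ X)
    b = (Xl.T @ yl) / l
    return np.linalg.solve(A, b)

Replacing L with a power L^m in the intrinsic term yields the iterated Laplacian semi-norm mentioned above; the additional matrix powers are what make that higher-order variant computationally intensive.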
