Abstract
Semi-Supervised Learning (SSL) has become an active research topic because it effectively addresses the problem of limited labeled data. Many SSL methods have been developed based on the manifold assumption; among them, Local and Global Consistency (LGC) is a popular method. The problem with most of these algorithms, and with LGC in particular, is that their naive implementations do not scale well with the size of the data: time and memory limitations are the major obstacles in large-scale problems. In this paper, we provide theoretical bounds for gradient descent and, to overcome the aforementioned problems, propose a new approximate Newton's method. Moreover, we provide a convergence analysis and theoretical bounds on the time complexity of the proposed method. We show that the number of iterations in the proposed methods depends logarithmically on the number of data points, which is a considerable improvement over the naive implementations. Experimental results on real-world datasets confirm the superiority of the proposed methods over LGC's default iterative implementation and a state-of-the-art factorization method.
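For context, a minimal sketch of LGC's default iterative implementation, the baseline the abstract compares against, following the standard update F_{t+1} = alpha * S * F_t + (1 - alpha) * Y with S = D^{-1/2} W D^{-1/2} from Zhou et al.'s original LGC formulation. Function and parameter names here are illustrative, not taken from this paper:

    import numpy as np

    def lgc_iterative(W, Y, alpha=0.99, tol=1e-6, max_iter=1000):
        """Naive iterative LGC label propagation (illustrative sketch).

        W : (n, n) symmetric affinity matrix with positive row sums.
        Y : (n, c) initial labels; one-hot rows for labeled points,
            zero rows for unlabeled points.
        """
        # Symmetric normalization: S = D^{-1/2} W D^{-1/2}, element-wise.
        d = W.sum(axis=1)
        S = W / np.sqrt(np.outer(d, d))
        # Iterate F <- alpha * S F + (1 - alpha) * Y until convergence.
        F = Y.astype(float).copy()
        for _ in range(max_iter):
            F_next = alpha * (S @ F) + (1.0 - alpha) * Y
            if np.linalg.norm(F_next - F) < tol:
                return F_next
            F = F_next
        return F

Each point's predicted class is the argmax of its row in the converged F. With a dense W, every iteration costs O(n^2) time and the matrix itself costs O(n^2) memory, which is exactly the scaling problem the abstract describes.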