Abstract

In this paper, we first propose a novel robust loss function, the adaptive capped Lθε-loss. The Lθε-loss has several attractive properties, including robustness, non-convexity, and boundedness, and its adaptive parameter θ allows the loss to be tailored to different problems during learning. Building on it, we present a new robust twin extreme learning machine (RTELM) framework that combines the Lθε-loss with a capped L1-norm distance metric. Compared with the twin extreme learning machine (TELM), RTELM inherits the advantages of TELM while overcoming the drawbacks of the L2-norm distance metric and the hinge loss, especially on problems with outliers. We further present a Laplacian RTELM (Lap-RTELM for short) by introducing manifold regularization terms into RTELM. Intuitively, Lap-RTELM effectively exploits the geometric information embedded in unlabeled samples, incorporating it through manifold regularization terms to learn a more reasonable classifier for semi-supervised classification (SSC) problems. Finally, we design two effective iterative algorithms to address the challenges posed by the non-convex optimization problems of RTELM and Lap-RTELM, and we theoretically establish the convergence, local optimality, and computational complexity of the algorithms. Experiments on multiple datasets show that the proposed RTELM and Lap-RTELM are competitive with existing methods.
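The robustness claimed for the capped L1-norm distance metric can be illustrated with a minimal sketch of the generic capped-loss idea, min(|r|, θ): truncating each residual at a cap θ bounds every sample's contribution, so a single outlier cannot dominate the objective. Note that this shows only the standard capped L1 form, not the paper's exact Lθε-loss definition, and the names below are illustrative.

```python
import numpy as np

def capped_l1(r, theta=1.0):
    """Generic capped L1 loss: |r| truncated at theta, so each
    sample contributes at most theta to the total objective."""
    return np.minimum(np.abs(r), theta)

# Residuals with one gross outlier in the last position.
residuals = np.array([0.1, -0.5, 2.0, -100.0])

plain = np.abs(residuals).sum()       # plain L1: outlier dominates (102.6)
capped = capped_l1(residuals).sum()   # capped L1: 0.1 + 0.5 + 1.0 + 1.0 = 2.6
```

The bounded total under the capped loss is what makes the resulting estimator insensitive to outliers, at the price of a non-convex objective, which is why iterative algorithms with convergence guarantees are needed.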
