Abstract

In general, introducing robust distance metrics and loss functions into the learning process can improve the robustness of learning algorithms. In this work, we first propose a new robust loss function called the adaptive capped Lθε-loss, whose adaptive parameter θ allows different loss functions to be selected for different problems during learning. Secondly, we propose a new robust correntropy-induced distance metric (CIM) based on the Laplacian kernel; the CIM captures first- and higher-order moments of the samples. We further establish several important and interesting properties of the Lθε-loss and the CIM, such as robustness, boundedness, and nonconvexity. Finally, we apply the Lθε-loss and the CIM to the twin support vector machine (TWSVM) and develop an adaptive robust learning framework, namely the adaptive robust twin support vector machine (ARTSVM). The proposed ARTSVM not only inherits the advantages of TWSVM but also improves robustness in classification problems. A non-convex optimization method, the DC (difference of convex functions) programming algorithm (DCA), is used to solve the proposed ARTSVM, and the convergence of the algorithm is proved theoretically. Experiments on multiple datasets show that the proposed ARTSVM is competitive with existing methods.
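To give a feel for why bounded surrogates such as a capped loss and a correntropy-induced metric resist outliers, the sketch below compares them with the squared loss on residuals contaminated by two gross outliers. It is a minimal illustration only: the paper's exact definitions of the adaptive capped Lθε-loss and of the Laplacian-kernel CIM are not reproduced in the abstract, so the forms, names (laplacian_kernel, cim, capped_eps_loss), and parameters (sigma, eps, cap) used here are generic textbook choices, not the authors' formulation.

```python
import numpy as np

# Illustrative sketch only: generic bounded robust surrogates, not the
# authors' exact adaptive capped L_{theta,eps}-loss or CIM definition.

def laplacian_kernel(u, sigma=1.0):
    """Laplacian kernel k(u) = exp(-|u| / sigma), normalized so k(0) = 1."""
    return np.exp(-np.abs(u) / sigma)

def cim(x, y, sigma=1.0):
    """Correntropy-induced metric CIM(x, y) = sqrt(k(0) - mean_i k(x_i - y_i)).
    Bounded in [0, 1); it saturates for large residuals, damping outliers."""
    return np.sqrt(1.0 - np.mean(laplacian_kernel(x - y, sigma)))

def capped_eps_loss(r, eps=0.1, cap=1.0):
    """A generic capped epsilon-insensitive loss: residuals inside the eps-tube
    cost nothing, and the loss is truncated at `cap` so a single outlier
    cannot dominate the objective. Nonconvex because of the cap."""
    return np.minimum(np.maximum(np.abs(r) - eps, 0.0), cap)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.normal(0.0, 0.1, size=100)              # small residuals
    dirty = np.concatenate([clean, [50.0, -80.0]])      # add two gross outliers

    # The squared loss explodes under contamination; the bounded surrogates barely move.
    print("mean squared loss:", np.mean(clean**2), "->", np.mean(dirty**2))
    print("mean capped loss :", np.mean(capped_eps_loss(clean)),
          "->", np.mean(capped_eps_loss(dirty)))
    print("CIM to zero      :", cim(clean, np.zeros_like(clean)),
          "->", cim(dirty, np.zeros_like(dirty)))
```

Because both surrogates are bounded, the contribution of any single sample is capped, which is the same intuition the abstract invokes for the robustness of the Lθε-loss and the CIM; the price is nonconvexity, which the paper handles with DCA.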
