Abstract

As two popular data mining methods, metric learning and SVM share an interesting and valuable internal relationship. The basic idea of metric learning is to learn a data-dependent metric, in place of the Euclidean metric, that shrinks the distances between similar points and enlarges the distances between dissimilar points. From a different viewpoint, LSSVM can achieve a similar goal: it finds two parallel hyperplanes such that the distance between each point and its corresponding hyperplane is as small as possible while the distance between the two hyperplanes is as large as possible. LSSVM can therefore be viewed as a slack version of metric learning. It can then be improved by modifying how the between-class distance is measured, which leads to our novel approach ML-LSSVM, which adds inter-class distance constraints to LSSVM. An alternating direction method of multipliers (ADMM) algorithm is implemented to solve ML-LSSVM efficiently, much faster than solving the original convex quadratic programming problem directly. Experiments validate the efficacy of ML-LSSVM and show that different measurements of intra-class and inter-class distance have a significant impact on classification. Finally, the relationship between LMNN and ML-LSSVM is discussed to show that the local formulation of LMNN is equivalent to ML-LSSVM.
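To make the metric-learning idea above concrete, here is a minimal, hedged sketch (not the paper's ML-LSSVM solver) of a Mahalanobis metric d_M(x, y) = sqrt((x − y)ᵀ M (x − y)), which generalizes the Euclidean distance (recovered when M is the identity); a suitably chosen positive semidefinite M can shrink distances along directions where similar points vary and stretch them along discriminative directions. The matrix values below are illustrative, not learned.

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance between x and y under the (assumed PSD) metric matrix M."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

# Two points differing only along the first coordinate axis.
x = np.array([1.0, 0.0])
y = np.array([0.0, 0.0])

# With M = I this is the ordinary Euclidean distance.
euclid = mahalanobis(x, y, np.eye(2))            # 1.0

# A diagonal M that down-weights axis 0 shrinks the same distance,
# as a learned metric would for a direction shared by similar points.
shrunk = mahalanobis(x, y, np.diag([0.25, 4.0]))  # 0.5
```

In actual metric-learning methods M is fit from similar/dissimilar pair constraints rather than chosen by hand; this snippet only illustrates the distance being learned.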
