Abstract

Metric and similarity learning are important approaches to classification and retrieval. To efficiently learn a distance metric or a similarity function, online learning algorithms have been widely applied. In general, however, existing online metric and similarity learning algorithms have limited performance in real-world classification and retrieval applications. In this paper, we introduce a convergent online metric learning model named scalable large margin online metric learning (SLMOML). SLMOML belongs to the passive-aggressive learning family. At each step, it adopts the LogDet divergence to maintain the closeness between two successively learned Mahalanobis matrices, and utilizes the hinge loss to enforce a large margin between relatively dissimilar samples. More importantly, the Mahalanobis matrix can be updated in closed form at each step. Furthermore, if the initial matrix is positive semi-definite (PSD), the matrices learned in all subsequent steps remain PSD. Based on the Karush–Kuhn–Tucker conditions and the established equivalence between the passive-aggressive learning family and Bregman projections, we prove the global convergence of SLMOML. Extensive experiments on classification and retrieval tasks demonstrate the effectiveness and efficiency of SLMOML.
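The PSD-preservation property mentioned above can be illustrated with a minimal sketch. The abstract does not give SLMOML's exact update rule, so the snippet below uses a generic ITML-style symmetric rank-one correction associated with LogDet Bregman projections; the step size `beta` is a hypothetical placeholder chosen only so that the positivity condition holds, not SLMOML's actual passive-aggressive step.

```python
import numpy as np

def logdet_rank_one_update(M, x, y, sign):
    """Hedged sketch of a LogDet-style rank-one Mahalanobis update.

    With v = x - y, the update M' = M + beta * (M v)(M v)^T is a
    symmetric rank-one correction. Whenever 1 + beta * v^T M v > 0,
    M' stays positive semi-definite if M was. Here beta is a toy
    step size (NOT SLMOML's closed-form step) scaled so that this
    positivity condition is guaranteed.
    """
    v = (x - y).reshape(-1, 1)
    vMv = float(v.T @ M @ v)
    # |beta * vMv| <= 0.3 < 1, so 1 + beta * vMv > 0 always holds.
    beta = sign * 0.3 / (1.0 + vMv)
    Mv = M @ v
    return M + beta * (Mv @ Mv.T)

rng = np.random.default_rng(0)
M = np.eye(3)  # start from a PSD matrix (the identity)
for _ in range(50):
    x, y = rng.normal(size=3), rng.normal(size=3)
    # Alternate signs to mimic pulling similar pairs together and
    # pushing dissimilar pairs apart.
    M = logdet_rank_one_update(M, x, y, rng.choice([-1.0, 1.0]))

min_eig = np.linalg.eigvalsh(M).min()  # stays non-negative (up to rounding)
```

This mirrors the property claimed in the abstract: starting from a PSD matrix, every update in this family returns a PSD matrix, so no eigenvalue projection step is needed during online learning.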
