Abstract

Deep metric learning aims to create a feature space in which the projected samples have maximised inter-class and minimised intra-class distances. Most approaches employ only distance-based metrics to achieve this objective, neglecting other properties of the projections in the embedding space, such as their density, sparsity and the presence of outliers. In this paper, we propose a novel density-based regulariser, LOFReg, designed to complement previously proposed distance-based metric learning loss functions for the re-identification (ReID) and few-shot classification (FSC) tasks. Our method is based on the local outlier factor (LOF) algorithm, well known in the anomaly detection literature, which estimates the local density deviation of a data point with respect to its neighbours. These measurements are used in our regularisation methodology to achieve an embedding space of evenly distributed samples and to increase the generalisation ability of the model compared to solely distance-based learning. Comprehensive experiments on four publicly available datasets for ReID and FSC demonstrate consistent improvement over previously proposed metric learning loss functions. In particular, our experiments show up to 5% improvement in ReID settings, up to 13% in FSC settings and up to 6% improvement over other state-of-the-art regularisers.
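To illustrate the idea, the sketch below computes LOF scores over a batch of embeddings and penalises their deviation from 1 (a score near 1 indicates a point whose local density matches that of its neighbours). This is a minimal illustration only, not the paper's implementation: the function name lof_regulariser, the neighbourhood size k, and the squared-deviation penalty are assumptions made for the example.

import torch

def lof_regulariser(embeddings: torch.Tensor, k: int = 5) -> torch.Tensor:
    # Illustrative sketch (assumed formulation): penalise the deviation of
    # each sample's local outlier factor from 1 so that the batch is
    # evenly distributed in the embedding space.
    # embeddings: (N, D) batch of projected samples.
    n = embeddings.size(0)

    # Pairwise Euclidean distances, with the diagonal masked out.
    dists = torch.cdist(embeddings, embeddings)
    dists = dists + torch.eye(n, device=dists.device) * 1e9

    # k nearest neighbours of every point.
    knn_dists, knn_idx = dists.topk(k, largest=False)   # both (N, k)

    # k-distance of each point = distance to its k-th nearest neighbour.
    k_dist = knn_dists[:, -1]                            # (N,)

    # Reachability distance: reach(p, o) = max(k_dist(o), d(p, o)).
    reach = torch.maximum(k_dist[knn_idx], knn_dists)    # (N, k)

    # Local reachability density: inverse of the mean reachability distance.
    lrd = 1.0 / (reach.mean(dim=1) + 1e-12)              # (N,)

    # LOF = average ratio of the neighbours' lrd to the point's own lrd.
    lof = lrd[knn_idx].mean(dim=1) / (lrd + 1e-12)       # (N,)

    # Encourage a uniformly dense embedding: LOF close to 1 everywhere.
    return ((lof - 1.0) ** 2).mean()

In use, such a term would be added to a distance-based metric learning loss, e.g. total_loss = triplet_loss + lam * lof_regulariser(embeddings), where the weight lam is a tunable hyperparameter.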
