Deep metric learning (DML) achieves excellent performance in many open-set scenarios. However, existing multi-proxy methods rely on a classification framework, and their performance depends on large batch sizes because a mini-batch poorly represents the neighbor distribution of the proxies. This study proposes a multi-proxy method that enhances the generalization ability of DML to unseen classes by improving data representation. Since a sample may not share all the fine-grained semantic information of its class, our model does not require full association between samples and proxies and is updated only according to the associated pairs. Moreover, because a single mini-batch may not contain sufficient information for learning, our model also exploits proxy-proxy relations, which provide a global view of the data structure and improve the learning of intra-class distances. We further investigate how the positive and negative margins, i.e., the model parameters that control the similarity required within a class and between different classes, affect performance, and find that a less demanding negative margin avoids over-training and improves generalization to unseen classes. Experiments on benchmark image retrieval datasets confirm that our model outperforms state-of-the-art DML methods.
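To make the margin mechanism concrete, the following is a minimal sketch of a proxy-based hinge loss with separate positive and negative margins and partial sample-proxy association. This is not the paper's actual formulation (the abstract gives no equations); the function name, the `assoc_threshold` rule, and the omission of the proxy-proxy term are all illustrative assumptions.

```python
import numpy as np

def multi_proxy_margin_loss(embeddings, labels, proxies, proxy_labels,
                            pos_margin=0.9, neg_margin=0.5,
                            assoc_threshold=0.0):
    """Illustrative sketch (not the paper's exact loss).

    Positive sample-proxy pairs (same class) are pulled until their
    cosine similarity exceeds pos_margin; negative pairs are pushed
    until similarity falls below neg_margin. A sample is associated
    only with same-class proxies whose similarity exceeds
    assoc_threshold, mimicking partial association: a sample need
    not match every proxy of its class.
    """
    # Normalize so dot products are cosine similarities.
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    p = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)
    sim = e @ p.T  # (batch_size, n_proxies)

    same = labels[:, None] == proxy_labels[None, :]
    # Partial association: drop positive pairs below the threshold,
    # so the model is updated only on associated pairs.
    pos_mask = same & (sim > assoc_threshold)
    neg_mask = ~same

    pos_loss = np.maximum(0.0, pos_margin - sim)[pos_mask].sum()
    neg_loss = np.maximum(0.0, sim - neg_margin)[neg_mask].sum()
    n_pairs = max(pos_mask.sum() + neg_mask.sum(), 1)
    return (pos_loss + neg_loss) / n_pairs
```

Under this sketch, loosening `neg_margin` (e.g. 0.5 instead of 0.1) stops penalizing moderately similar negative pairs, which is one way to read the abstract's finding that a less demanding negative margin reduces over-training on seen classes.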