Abstract

Center loss is widely used as a supervision tool in deep learning methods. However, center loss has several shortcomings, the most important of which is that it must be combined with softmax loss to work well. In this article, we identify five shortcomings of center loss and address all of them by proposing a dual distance center loss (DDCL). Unlike center loss, DDCL can supervise model training without being combined with softmax loss. In addition, we verify the inconsistency between the proposed DDCL and softmax loss in the feature space. Specifically, we add the Pearson distance to the class center on top of the Euclidean distance, which confines all features of the same class to the intersection of a hypersphere and a hypercone in the feature space, strengthening the intraclass compactness of center loss and enhancing its generalization ability. Moreover, by imposing a Euclidean distance threshold between all center pairs, we not only strengthen the interclass separability of center loss but also enable center loss (and thus DDCL) to work well without softmax loss. We verify the effectiveness of DDCL on four datasets: VeRi-776 and VehicleID, which are widely used in vehicle re-identification, and Market1501 and MSMT17, which are widely used in person re-identification. On all four datasets, the proposed DDCL outperforms softmax loss, indicating that our method not only runs without softmax but also achieves high accuracy.
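To make the dual distance idea concrete, the following is a minimal PyTorch sketch of a loss in the spirit of DDCL: a squared Euclidean term pulls features toward their class center (the hypersphere constraint), a Pearson distance term aligns features with the center direction (the hypercone constraint), and a hinge penalty keeps every pair of centers at least a threshold apart. The hinge form, the threshold delta, and the weights w_pearson and w_sep are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class DualDistanceCenterLoss(nn.Module):
    """Sketch of a dual distance center loss (DDCL-style).

    Combines (1) squared Euclidean distance to the class center,
    (2) Pearson distance to the same center, and (3) a hinge penalty
    that keeps all center pairs at least `delta` apart. Weights and
    the hinge form are assumptions for illustration.
    """

    def __init__(self, num_classes, feat_dim, delta=1.0,
                 w_pearson=1.0, w_sep=1.0):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.delta = delta          # inter-center distance threshold (assumed)
        self.w_pearson = w_pearson  # weight of the Pearson term (assumed)
        self.w_sep = w_sep          # weight of the separability term (assumed)

    @staticmethod
    def pearson_distance(x, y):
        # Pearson distance = 1 - Pearson correlation, which equals
        # cosine distance after mean-centering each vector.
        xc = x - x.mean(dim=1, keepdim=True)
        yc = y - y.mean(dim=1, keepdim=True)
        return 1.0 - nn.functional.cosine_similarity(xc, yc, dim=1)

    def forward(self, feats, labels):
        centers = self.centers[labels]                   # (B, D)
        d_euc = (feats - centers).pow(2).sum(dim=1)      # hypersphere term
        d_pea = self.pearson_distance(feats, centers)    # hypercone term

        # Hinge penalty on every distinct center pair closer than delta.
        pair_d = torch.cdist(self.centers, self.centers)  # (C, C)
        off_diag = ~torch.eye(self.centers.size(0), dtype=torch.bool,
                              device=pair_d.device)
        sep = torch.clamp(self.delta - pair_d[off_diag], min=0).pow(2).mean()

        return d_euc.mean() + self.w_pearson * d_pea.mean() + self.w_sep * sep
```

Because the separability term gives centers their own repulsive gradient, a loss of this shape can in principle be optimized on its own, without a softmax branch, which matches the abstract's claim about DDCL.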
