Abstract

Deep Convolutional Neural Network (DCNN) models have become popular for feature extraction tasks, and one of the most effective ways to classify the extracted features is through the choice of loss function. Softmax loss is one of the losses originating from the One-Shot Learning approach; however, it is considered impractical for real-world tasks because the number of labels can change frequently, requiring retraining whenever new labels emerge. In contrast, loss functions from the Similarity Learning approach, such as Intra loss, Inter loss, Triplet loss, and Margin loss, can overcome these drawbacks. Among them, margin-based losses have recently been claimed to be the most effective strategies: CosFace adds a fixed cosine margin and normalises both the weight and feature vectors so that the decision depends only on the cosine of the angle between them, while ArcFace adds an additive angular penalty margin inside the cosine function to simultaneously enhance intra-class compactness and inter-class discrepancy. However, we observe that the cosine function used by these strategies does not fully discriminate between angles as they become small, because of its bounded range of values. We propose Large Margin Cotangent Loss (LMCot), which replaces the cosine function with the cotangent function; since the cotangent is unbounded, it separates small angles better. As a result, our proposed method has great potential to improve performance on verification and identification tasks. Furthermore, in experiments on various datasets from accredited competitions, LMCot advanced the state of the art on several measurements.
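The abstract does not give the exact loss formulation, so the following PyTorch sketch only illustrates the general idea under stated assumptions: an ArcFace-style classification head in which the additive angular margin is kept, but the cos(·) in the logits is replaced by the unbounded cot(·). The class name `LMCotHead` and the default `scale` and `margin` values are hypothetical, chosen for illustration; the paper's actual formulation may differ.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class LMCotHead(nn.Module):
    """Hypothetical sketch of a cotangent-margin classification head.

    Assumes the angular margin m is added to the target-class angle as
    in ArcFace, but with the unbounded cot(.) replacing cos(.) in the
    logits. The exact formulation in the LMCot paper may differ.
    """

    def __init__(self, embed_dim: int, num_classes: int,
                 scale: float = 16.0, margin: float = 0.3):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embed_dim))
        nn.init.xavier_uniform_(self.weight)
        self.scale = scale
        self.margin = margin

    def forward(self, embeddings: torch.Tensor,
                labels: torch.Tensor) -> torch.Tensor:
        # Cosine of the angle between L2-normalised features and class weights.
        cos_theta = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cos_theta.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        # Add the angular penalty only to the ground-truth class, and keep
        # the penalised angle inside (0, pi) so cot stays well defined.
        one_hot = F.one_hot(labels, num_classes=self.weight.size(0)).to(theta.dtype)
        theta_m = (theta + one_hot * self.margin).clamp(1e-4, math.pi - 1e-4)
        # cot(theta) = cos(theta) / sin(theta) grows without bound as
        # theta -> 0, so small angular differences are not compressed
        # the way the bounded cosine compresses them.
        logits = self.scale * torch.cos(theta_m) / torch.sin(theta_m)
        return F.cross_entropy(logits, labels)
```

A minimal usage example, assuming embeddings come from any DCNN backbone:

```python
head = LMCotHead(embed_dim=512, num_classes=1000)
feats = torch.randn(8, 512)            # stand-in for backbone embeddings
labels = torch.randint(0, 1000, (8,))
loss = head(feats, labels)
loss.backward()
```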
