Abstract

The loss function is the criterion used to evaluate how well a deep convolutional neural network learns, measuring the gap between predictions and ground truth. However, Cross-Entropy loss, the most commonly used loss function in image classification, does not encourage the model to separate similar features. In this work, the authors investigate the inter-class separability of similar features learnt by convolutional networks and propose a loss function called Error Refactor Loss (ER-Loss). ER-Loss is based on the errors made by the convolutional network; it improves inter-class separability, is simple to implement, and can easily replace Cross-Entropy loss. Compared with softmax loss, ER-Loss adds a dynamic penalty item that monitors the actual state of model training and adjusts its value accordingly. ER-Loss is evaluated on CIFAR100 and on part of ImageNet ILSVRC 2012, and the experimental results show that it improves model accuracy.
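The abstract does not give the exact formulation of ER-Loss. As an illustration only, the following is a minimal sketch of cross-entropy augmented with a dynamic, error-driven penalty term; the function name, the penalty form (batch error rate times mean probability mass on wrong classes), and the `penalty_weight` parameter are all assumptions, not the paper's actual method.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def er_loss_sketch(logits, labels, penalty_weight=0.1):
    """Cross-entropy plus a dynamic penalty item (illustrative only).

    The penalty used here -- batch error rate times the mean probability
    assigned to wrong classes -- is an assumed stand-in for the paper's
    error-based penalty, which the abstract does not specify. Because the
    error rate changes as training progresses, the penalty adapts to the
    model's current state, mirroring the "dynamic" behaviour described.
    """
    probs = softmax(logits)
    n = logits.shape[0]
    # Standard cross-entropy on the true-class probabilities.
    ce = -np.log(probs[np.arange(n), labels] + 1e-12).mean()
    # Dynamic term: fraction of the batch currently misclassified.
    preds = probs.argmax(axis=-1)
    error_rate = (preds != labels).mean()
    # Probability mass the model wastes on wrong classes.
    wrong_mass = 1.0 - probs[np.arange(n), labels]
    penalty = error_rate * wrong_mass.mean()
    return ce + penalty_weight * penalty
```

Note that when every sample in the batch is classified correctly, the error rate is zero and the sketch reduces to plain cross-entropy, so the penalty only acts while the model is still making mistakes.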
