Abstract
Convolutional neural networks (CNNs) have demonstrated great competence in feature representation and, consequently, have achieved good performance on many classification tasks. Cross-entropy loss combined with softmax is arguably the most commonly used loss function in CNNs (generally called the softmax loss). However, the softmax loss can yield a weakly discriminative feature representation because it focuses on interclass separability rather than intraclass compactness. This article proposes a pairwise Gaussian loss (PGL) for CNNs that addresses intraclass compactness by heavily penalizing similar sample pairs separated by a relatively large distance, while still ensuring good interclass separability. Experiments show that PGL enables CNNs to obtain better classification performance than not only the softmax loss but also other losses commonly used in CNNs. Our experiments also show that PGL converges stably under stochastic gradient descent optimization and generalizes well across different CNN architectures.
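To make the idea concrete, the following is a minimal sketch of a pairwise Gaussian-style loss in the spirit described by the abstract. The exact formulation, symbols, and hyperparameters of the paper's PGL are not given here, so this form (same-class pairs penalized by `1 - exp(-d²/2σ²)`, which grows as the pair drifts apart, and different-class pairs penalized by `exp(-d²/2σ²)`, which shrinks as they separate) is an illustrative assumption, not the authors' definition.

```python
import numpy as np

def pairwise_gaussian_loss(features, labels, sigma=1.0):
    """Illustrative pairwise Gaussian loss (hypothetical form; the
    paper's exact PGL may differ).

    - Same-class pair at distance d: penalty 1 - exp(-d^2 / (2 sigma^2)),
      which approaches 1 as d grows -> encourages intraclass compactness.
    - Different-class pair: penalty exp(-d^2 / (2 sigma^2)),
      which shrinks as d grows -> encourages interclass separability.
    """
    n = len(labels)
    total, count = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            d2 = np.sum((features[i] - features[j]) ** 2)
            g = np.exp(-d2 / (2.0 * sigma ** 2))
            total += (1.0 - g) if labels[i] == labels[j] else g
            count += 1
    return total / count

# A compact, well-separated embedding yields a lower loss than one
# where same-class samples are far apart and classes overlap.
good = pairwise_gaussian_loss(
    np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]]),
    np.array([0, 0, 1, 1]))
bad = pairwise_gaussian_loss(
    np.array([[0.0, 0.0], [5.0, 5.0], [0.1, 0.0], [5.1, 5.0]]),
    np.array([0, 0, 1, 1]))
```

In a real training loop this quantity would typically be computed over mini-batch features and combined with the softmax loss, with gradients flowing back through the network; this sketch only illustrates the penalty structure.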