Convolutional Neural Networks (CNNs) have been at the forefront of neural network research in recent years. Their breakthrough performance in fields such as image classification has spurred the development of new CNN-based architectures, but more attention has recently been directed to the study of new loss functions. Softmax loss remains the most popular loss function, mainly due to its efficiency at class separation, but it is unsatisfactory in terms of intra-class compactness. While some studies have addressed this problem, most solutions attempt to refine softmax loss or combine it with other approaches. We present a novel loss function based on distance matrices (LDMAT), independent of softmax, that maximizes inter-class distance and minimizes intra-class distance. The loss function operates directly on deep features, allowing its use with arbitrary classifiers. LDMAT minimizes the distance between two distance matrices: one constructed from the model's deep features and the other computed from the labels. The use of a distance matrix in the loss function yields a two-dimensional representation of features and imposes a fixed distance between classes, while improving intra-class compactness. We also present a regularization method, applied to the distance matrix of labels, that allows a degree of relaxation of the solution and leads to a better spreading of features in the separation space. Efficient feature extraction was observed on datasets such as MNIST, CIFAR-10, and CIFAR-100.
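The core idea, matching a feature-derived distance matrix to a label-derived one, can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the choice of Euclidean distance, the fixed inter-class distance `d`, and the squared-error comparison between the two matrices are all assumptions made here for concreteness.

```python
import numpy as np

def pairwise_dist(X):
    """Euclidean distance matrix between the rows of X (the deep features)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.sqrt(np.maximum(d2, 0.0))  # clip tiny negatives from round-off

def label_dist_matrix(y, d=1.0):
    """Target matrix: 0 for same-class pairs, a fixed distance d otherwise.

    The value of d is an illustrative assumption, not specified by the abstract.
    """
    y = np.asarray(y)
    return d * (y[:, None] != y[None, :]).astype(float)

def ldmat_loss(features, labels, d=1.0):
    """Mean squared difference between the two distance matrices (assumed metric)."""
    D_feat = pairwise_dist(features)
    D_label = label_dist_matrix(labels, d)
    return np.mean((D_feat - D_label) ** 2)
```

For example, features `[[0, 0], [0, 0], [1, 0]]` with labels `[0, 0, 1]` already realize the target matrix exactly (same-class distance 0, cross-class distance 1), so the loss is zero; any deviation from that geometry increases it. Because the loss compares whole pairwise-distance matrices, it simultaneously pushes different-class pairs toward the fixed distance `d` and same-class pairs toward distance 0, which is how inter-class separation and intra-class compactness are encoded in a single term.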