Abstract

Recent advances in deep neural networks (DNNs) have been driven mainly by network architectures and loss functions; by contrast, the development of neuron models has been quite limited. In this study, inspired by the mechanism of human cognition, a hyper-sausage coverage function (HSCF) neuron model possessing highly flexible plasticity is proposed. Then, a novel cross-entropy and volume-coverage (CE_VC) loss is defined, which compresses the volume of the hyper-sausage as much as possible and helps alleviate confusion among different classes, thus ensuring the intra-class compactness of the samples. Finally, a divisive iteration method is introduced, which treats each neuron model as a weak classifier and iteratively increases the number of weak classifiers. Thus, the optimal number of HSCF neurons is determined adaptively and an end-to-end learning framework is constructed. In particular, the HSCF neuron can be applied to classical DNNs to improve their classification performance. Comprehensive experiments on eight datasets from several domains demonstrate the effectiveness of the proposed method, which exhibits the feasibility of boosting DNNs with neuron plasticity and provides a novel perspective for further developments in DNNs. The source code is available at https://github.com/Tough2011/HSCFNet.git.
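The abstract only summarizes the method, but the geometry admits a simple reading: a hyper-sausage is the set of points within some radius of a line segment, so each class can be scored by the distance of a feature vector to a learnable segment, while the volume-coverage term shrinks the radius. The PyTorch sketch below illustrates this interpretation; the class name HSCFNeuron, the endpoint/radius parameterization (p1, p2, log_r), the helper ce_vc_loss, and the trade-off weight vol_weight are illustrative assumptions, not the authors' implementation (see the linked repository for that).

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class HSCFNeuron(nn.Module):
        """One hyper-sausage per class: the coverage region is the set of
        points within radius r of the segment [p1, p2] in feature space
        (a hypothetical parameterization, not the paper's exact one)."""
        def __init__(self, feat_dim, num_classes):
            super().__init__()
            self.p1 = nn.Parameter(torch.randn(num_classes, feat_dim))
            self.p2 = nn.Parameter(torch.randn(num_classes, feat_dim))
            self.log_r = nn.Parameter(torch.zeros(num_classes))  # per-class radius

        def point_segment_dist(self, x):
            # x: (B, D) features -> (B, C) distances to each class segment.
            d = self.p2 - self.p1                                 # (C, D)
            v = x.unsqueeze(1) - self.p1.unsqueeze(0)             # (B, C, D)
            t = (v * d.unsqueeze(0)).sum(-1) / d.pow(2).sum(-1).clamp_min(1e-8)
            t = t.clamp(0.0, 1.0)                                 # project onto segment
            proj = self.p1.unsqueeze(0) + t.unsqueeze(-1) * d.unsqueeze(0)
            return (x.unsqueeze(1) - proj).norm(dim=-1)

        def forward(self, x):
            # Larger logit = deeper inside that class's coverage region.
            return self.log_r.exp().unsqueeze(0) - self.point_segment_dist(x)

    def ce_vc_loss(logits, labels, log_r, vol_weight=0.1):
        """CE_VC-style objective sketch: cross-entropy for inter-class
        separability plus a radius penalty as a proxy for sausage volume
        (intra-class compactness). vol_weight is a hypothetical
        trade-off hyperparameter."""
        return F.cross_entropy(logits, labels) + vol_weight * log_r.exp().mean()

Plugged on top of a backbone, e.g. logits = HSCFNeuron(512, 10)(features), such a head acts like one weak classifier per class; the divisive iteration described above would then add further HSCF neurons whenever a single segment cannot cover a class's samples.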
