Abstract
This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). As neural networks grow larger, they consume ever more computing resources, creating a strong need to prune neurons that contribute little to a network's output. However, pruning low-contribution neurons can degrade a DNM's accuracy. Our proposed method is novel in that 1) it reduces the number of dendrites in a DNM and improves training efficiency without sacrificing accuracy, and 2) it selects proper initialization weights and thresholds for the neurons. After initialization with the proposed DT-based method, the DNM is trained with the Adam algorithm. To verify its effectiveness, we apply it to seven benchmark datasets. The results show that the DT-initialized DNM significantly outperforms the original DNM, k-nearest neighbor, support vector machine, back-propagation neural network, and DT classification methods. It exhibits the lowest model complexity and highest training speed without any loss of accuracy, and interactions among attributes can be observed in its dendritic neurons.
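For readers unfamiliar with the model class being initialized, the following is a minimal sketch of a generic DNM forward pass (sigmoid synapses, multiplicative dendritic branches, a summing membrane, and a sigmoid soma). The function name, parameter names, and constants are illustrative assumptions, not the paper's exact architecture or settings.

```python
import numpy as np

def dnm_forward(x, W, Theta, k=5.0, ks=5.0, theta_s=0.5):
    """Illustrative forward pass of a generic dendritic neuron model.

    x: (n_features,) input sample.
    W, Theta: (n_dendrites, n_features) synaptic weights and thresholds
    (the quantities a DT-based scheme would initialize). All names and
    default values here are assumptions for illustration only.
    """
    # Synaptic layer: sigmoid of (w * x - theta) for each connection
    Y = 1.0 / (1.0 + np.exp(-k * (W * x - Theta)))
    # Dendritic layer: multiplicative integration along each branch,
    # which is what lets attribute interactions appear on a dendrite
    Z = np.prod(Y, axis=1)
    # Membrane layer: sum the dendritic branch outputs
    V = np.sum(Z)
    # Soma: final sigmoid squashing the membrane potential into (0, 1)
    return 1.0 / (1.0 + np.exp(-ks * (V - theta_s)))
```

Because each dendrite multiplies its synaptic outputs, a branch whose product stays near zero contributes almost nothing to the membrane sum, which is the intuition behind pruning low-contribution dendrites.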
Published in: IEEE Transactions on Neural Networks and Learning Systems