Abstract

Large-scale artificial neural networks contain many redundant structures, which make the network prone to local optima and prolong training time. Moreover, existing neural network topology optimization algorithms suffer from heavy computation and complex network structure modeling. We propose a Dynamic Node-based neural network Structure optimization algorithm (DNS) to address these issues. DNS consists of two steps: a generation step and a pruning step. In the generation step, the network adds hidden layers one by one until its accuracy reaches a threshold. In the pruning step, the network then applies a pruning algorithm based on Hebb's rule or Pearson's correlation to adapt its structure. In addition, we combine DNS with a genetic algorithm (GA-DNS). Experimental results show that, compared with traditional neural network topology optimization algorithms, GA-DNS generates neural networks with higher construction efficiency, lower structural complexity, and higher classification accuracy.
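To make the two-step procedure concrete, below is a minimal sketch of the generation step only, using scikit-learn's MLPClassifier as a stand-in network. The dataset, fixed layer width of 32, 0.90 accuracy threshold, and five-layer cap are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of the DNS generation step: grow the network layer by
# layer until validation accuracy reaches a threshold. All hyperparameters
# here are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

acc_threshold = 0.90
hidden = []                          # hidden layers added so far

# Generation step: append one hidden layer at a time, retrain, and stop
# once validation accuracy reaches the threshold (or a layer cap is hit).
while True:
    hidden.append(32)                # assumed fixed layer width
    net = MLPClassifier(hidden_layer_sizes=tuple(hidden),
                        max_iter=500, random_state=0).fit(X_tr, y_tr)
    if net.score(X_val, y_val) >= acc_threshold or len(hidden) >= 5:
        break

print(f"layers used: {len(hidden)}, val accuracy: {net.score(X_val, y_val):.3f}")
```

The pruning step would then remove weakly important connections from the grown network; a sketch of the Pearson-based importance criterion follows the Introduction below.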

Highlights

  • Artificial neural networks are rapidly growing in scale as performance demands increase

  • We used the Pearson correlation coefficient [3] to measure the correlation between neurons in adjacent layers and designed our Dynamic Node-based neural network Structure optimization algorithm (DNS)

  • Easy modeling and strong usability are essential qualities of any practical neural network generation algorithm


Introduction

Traditional artificial neural networks use full connections between layers, which leads to redundant structures that waste hardware and software resources. Inspired by the developmental construction process of biological neural networks, we present an adaptive neural network algorithm based on dynamic nodes. To remove redundant network structures and make the network's neurons adaptive, we need a way to measure the correlation between neurons. We use the Pearson correlation coefficient [3] to measure the correlation between neurons in adjacent layers and design our Dynamic Node-based neural network Structure optimization algorithm (DNS). The network uses Hebb's rule or Pearson's correlation to measure the importance of connections and prunes accordingly. In DNS, both the number of neurons and the structure of the synapses change dynamically, adapting to the current task.
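As a concrete illustration of the Pearson-based criterion, the sketch below correlates the activations of pre- and post-layer neurons over a batch and masks out weakly correlated connections. The layer sizes, tanh activation, random weights, and 0.1 threshold are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch: Pearson correlation as a connection-importance measure.
# For each connection (i, j) between adjacent layers, correlate the
# activation of pre-neuron i with that of post-neuron j over a batch,
# then prune weights whose |r| falls below a threshold.
import numpy as np

rng = np.random.default_rng(0)
pre = rng.standard_normal((256, 16))        # batch of pre-layer activations
W = rng.standard_normal((16, 8)) * 0.3      # weights between the two layers
post = np.tanh(pre @ W)                     # post-layer activations

# Pearson correlation between every pre-neuron and every post-neuron:
# z-score both activation matrices, then average the products over the batch.
pre_z = (pre - pre.mean(0)) / pre.std(0)
post_z = (post - post.mean(0)) / post.std(0)
corr = pre_z.T @ post_z / len(pre)          # shape (16, 8), entries in [-1, 1]

mask = np.abs(corr) >= 0.1                  # keep only well-correlated pairs
W_pruned = W * mask
print(f"pruned {np.sum(~mask)} of {mask.size} connections")
```

A Hebbian variant would rank the same connections by the batch-averaged product of pre- and post-activations instead of their correlation; either criterion yields a mask applied to the weight matrix as above.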
