Abstract

The focus on evolutionary neural architecture search in this issue has driven me to ponder the evolution of biological neural networks, or rather, the human brain. Researchers in neuroanatomy have found that the cognitive and mental development of humans can be attributed to the increase in brain size throughout evolution. Yet bigger is not always better. It has also been discovered that, once the brain reaches a certain size, further growth only renders it less efficient. In addition, the brain is constrained by its inherent architecture and signal processing time. These discoveries in neuroscience are interestingly analogous to advances in neural networks and evolutionary computation. Stacking perceptrons empowers artificial neural networks to solve complex problems but detracts from their efficiency. Indeed, human brain evolution and artificial neural network evolution both face a trade-off between capacity and efficiency. The workings of nature truly give us a lot to mull over. This issue includes five Features articles. The first article proposes an evolutionary multi-objective model compression approach to simultaneously optimize the model size and accuracy of a deep neural network. The second and third articles adopt the notion of self-supervised learning in neural architecture evolution, resulting in state-of-the-art performance. To improve the accuracy of wind speed forecasting, the fourth article uses an evolutionary algorithm to optimize the architecture of dendritic neural regression. The fifth article presents a self-adaptive mutation strategy for block-based evolutionary neural architecture search on convolutional neural networks. In the Columns section, the article proposes a multi-view feature construction method based on genetic programming and ensemble techniques.
