Abstract

Evolutionary Neural Architecture Search (ENAS) is a promising approach to the automated design of deep neural network architectures and has attracted extensive attention in the field of automated machine learning. However, existing ENAS methods typically require substantial computing resources to design CNN architectures automatically. To achieve efficient automated design of CNNs, this paper improves efficiency in two ways. First, efficient CNN building blocks are introduced to ensure the effectiveness of the generated architectures, and a triplet attention mechanism is incorporated into the architectures to further improve classification performance. Second, a random forest-based performance predictor is used during fitness evaluation, reducing the computation that would otherwise be spent training each individual from scratch. Experimental results show that the proposed algorithm significantly reduces the computational resources required while achieving competitive classification performance on the CIFAR dataset. Moreover, the architecture designed for the traffic sign recognition task surpasses the accuracy of manually designed expert architectures.
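
To illustrate the surrogate-based fitness evaluation described above, the following is a minimal sketch of a random forest performance predictor, assuming architectures can be encoded as fixed-length numeric vectors and that a small history of fully trained architectures with known accuracies is available. The encoding scheme, the `encode_architecture` helper, and the sample data are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: random forest surrogate for fitness evaluation in ENAS.
# The architecture encoding and the sample history below are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def encode_architecture(arch):
    """Map an architecture (e.g., a list of block/attention choices)
    to a fixed-length numeric vector. Encoding is an assumption."""
    return np.asarray(arch, dtype=float)

# History of architectures already trained from scratch, with validation accuracy.
history = [
    ([1, 0, 2, 1, 3, 0], 0.921),
    ([2, 1, 1, 0, 2, 1], 0.934),
    ([0, 2, 3, 1, 1, 2], 0.917),
]

X = np.stack([encode_architecture(a) for a, _ in history])
y = np.array([acc for _, acc in history])

# Fit the surrogate once on the trained architectures.
predictor = RandomForestRegressor(n_estimators=100, random_state=0)
predictor.fit(X, y)

def fitness(arch):
    """Predicted accuracy used as the fitness value, avoiding full training."""
    return predictor.predict(encode_architecture(arch).reshape(1, -1))[0]

# Evaluate a new candidate architecture without training it from scratch.
print(fitness([1, 1, 2, 0, 3, 1]))
```

In practice, the predictor would be refitted as more individuals are fully trained, so that the surrogate improves over the course of the evolutionary search.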
