Abstract

Over the past decades, Bayesian neural networks have attracted much attention because they are less prone to over-fitting and can produce uncertainty estimates alongside their predictions. However, compared with traditional neural networks, a Bayesian neural network has many more hyper-parameters to optimize, so its performance on classification or regression problems over large-scale datasets is often not much better than that of ordinary neural networks. Therefore, in order to design a Bayesian network with superior performance, we propose VIBCNN-EvoNAS, a Bayesian convolutional neural network architecture search framework based on variational inference. It constructs a search space through a fixed-length integer encoding scheme and uses an evolutionary algorithm as the search strategy to explore in depth the influence of convolution kernel size and other related parameters on the network architecture. In addition, to reduce the time cost of individual evaluation, we adopt an early-stopping mechanism in the performance-evaluation stage. The proposed method is evaluated on the CIFAR10 and CIFAR100 datasets, and the experimental results show its effectiveness.
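The combination of a fixed-length integer encoding and an evolutionary search strategy can be sketched roughly as follows. This is a minimal toy illustration only, not the paper's actual method: the gene meanings, value ranges, genetic operators, and the fitness proxy (which in the real framework would be an early-stopped training run of the Bayesian CNN) are all assumptions made for the sake of the example.

```python
import random

# Hypothetical fixed-length integer encoding: each gene picks one value
# from a small set of architecture options. The specific genes and
# ranges below are illustrative assumptions, not the paper's encoding.
GENE_CHOICES = [
    [3, 5, 7],      # gene 0: convolution kernel size
    [16, 32, 64],   # gene 1: number of filters
    [1, 2],         # gene 2: stride
    [0, 1],         # gene 3: pooling layer present? (0/1)
]

def random_individual():
    return [random.choice(choices) for choices in GENE_CHOICES]

def mutate(individual, rate=0.25):
    # Resample each gene independently with probability `rate`.
    child = individual[:]
    for i, choices in enumerate(GENE_CHOICES):
        if random.random() < rate:
            child[i] = random.choice(choices)
    return child

def crossover(a, b):
    # Single-point crossover on the fixed-length integer string.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def fitness(individual):
    # Placeholder for individual evaluation. In the real framework this
    # would train the decoded Bayesian CNN (with early stopping) and
    # return validation accuracy; here it is just a toy function.
    kernel, filters, stride, pool = individual
    return (filters / 64) - abs(kernel - 5) * 0.1 \
        - (stride - 1) * 0.2 + pool * 0.05

def evolve(pop_size=10, generations=20, seed=0):
    random.seed(seed)
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        # Elitist selection: keep the best half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

Because the encoding has a fixed length, standard crossover and mutation operators apply directly, and the early-stopped evaluation keeps the cost of each `fitness` call bounded.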
