Abstract

Deep belief networks (DBNs), a core deep learning technology, have been successfully applied in many fields. However, the structure of a DBN is difficult to design for different datasets. Hence, a DBN structure design algorithm based on information entropy and reconstruction error is proposed. Unlike previous algorithms, it treats network depth and node number jointly and optimizes them simultaneously. First, a mathematical model of the structural design problem is established, and a boundary constraint on the node number based on information entropy is derived by introducing the idea of information compression. Second, an optimization objective for network performance based on reconstruction error is proposed by showing that network energy is proportional to reconstruction error. Finally, an improved simulated annealing (ISA) algorithm is used to adjust the DBN layers and nodes simultaneously. Experiments were carried out on three public datasets (MNIST, Cifar-10, and Cifar-100). The results show that the proposed algorithm can design a proper structure for each dataset, yielding a trained DBN with the lowest reconstruction error and prediction error rate. The proposed algorithm shows the best performance among the compared algorithms and can be used to assist in setting DBN structural parameters for different datasets.
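As a concrete illustration of the information-compression idea behind the node-number constraint, the sketch below estimates the Shannon entropy of binarized training data and uses it as a rough lower bound on the size of a binary hidden layer. This is only a minimal reading of the idea, not the paper's exact derivation; the per-feature independence assumption, the 0.5 binarization threshold, and the random stand-in data are illustrative choices.

import numpy as np

def entropy_lower_bound(data, threshold=0.5):
    # Binarize the inputs and estimate the Shannon entropy (in bits) of each feature.
    # Each binary hidden unit can carry at most one bit, so the summed per-feature
    # entropy gives a rough lower bound on hidden-layer size if the layer is to
    # preserve the information contained in the data.
    binary = (data > threshold).astype(float)
    p = binary.mean(axis=0)                      # empirical P(feature = 1)
    p = np.clip(p, 1e-12, 1 - 1e-12)             # avoid log(0)
    per_feature = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return int(np.ceil(per_feature.sum()))

# Example with a random stand-in for a (num_samples, num_pixels) training matrix.
X = np.random.rand(1000, 784)
print("entropy-based lower bound on hidden units:", entropy_lower_bound(X))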

Highlights

  • A deep belief network (DBN) is a kind of deep artificial neural network (ANN) [1]

  • The internal energy of the solution in the improved simulated annealing (ISA) algorithm is equal to the reconstruction error of the restricted Boltzmann machine (RBM) at the highest level of the DBN (a sketch of this computation follows the list)

  • We refer to the proposed algorithm as the information entropy and reconstruction error via ISA (IEREISA) method
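The second highlight equates the internal energy of a candidate solution with the reconstruction error of the DBN's top-level RBM. The sketch below shows the standard way such a reconstruction error is computed for a binary-binary RBM (one up-pass, one down-pass, mean squared error); the weight shapes and variable names follow generic RBM conventions rather than the paper's notation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_reconstruction_error(v, W, b_visible, c_hidden):
    # v:         (batch, n_visible) input batch
    # W:         (n_visible, n_hidden) weight matrix
    # b_visible: (n_visible,) visible biases
    # c_hidden:  (n_hidden,) hidden biases
    h_prob = sigmoid(v @ W + c_hidden)           # up-pass: P(h = 1 | v)
    v_recon = sigmoid(h_prob @ W.T + b_visible)  # down-pass: mean-field reconstruction
    return float(np.mean((v - v_recon) ** 2))

# In an ISA-style search, this value for the top-level RBM would serve as the
# "internal energy" of a candidate structure.
rng = np.random.default_rng(0)
v = (rng.random((32, 784)) > 0.5).astype(float)
W = rng.normal(scale=0.01, size=(784, 500))
print(rbm_reconstruction_error(v, W, np.zeros(784), np.zeros(500)))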

Introduction

A deep belief network (DBN) is a kind of deep artificial neural network (ANN) [1]. An ANN, which originated from Rosenblatt's perceptron model, is an information-processing network composed of simple nodes and has nonlinear fitting ability [2]. Previous studies have preliminarily discussed design methods for DBN structure, but each addresses only a single aspect of the structure, either the network depth or the number of nodes, and none fully considers the unsupervised training process of the DBN. To improve DBN performance by changing its structure, a structure design algorithm is needed that treats network depth and node number jointly and organically. The proposed algorithm combines the network depth and the number of nodes in a unified mathematical model, introduces information entropy and reconstruction error, and uses the improved simulated annealing (ISA) algorithm to solve the resulting optimization problem. The constructed DBN achieves lower reconstruction and root-mean-square errors during training as well as a low prediction error rate during testing.
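To make that optimization loop concrete, the following is a minimal simulated-annealing skeleton over a joint (depth, layer-width) encoding. It assumes a caller-supplied evaluate(structure) that pretrains a DBN with the given hidden-layer sizes and returns the top-RBM reconstruction error as the energy; the proposal rules, bounds, and cooling schedule are generic placeholders and do not reproduce the specific improvements of the ISA algorithm.

import math
import random

def propose(structure, min_nodes, max_nodes, min_depth=1, max_depth=6):
    # Perturb either the depth (add/remove a hidden layer) or one layer's width.
    new = list(structure)
    if random.random() < 0.3:  # change depth
        if len(new) < max_depth and (len(new) == min_depth or random.random() < 0.5):
            new.insert(random.randrange(len(new) + 1), random.randint(min_nodes, max_nodes))
        elif len(new) > min_depth:
            new.pop(random.randrange(len(new)))
    else:  # change one layer's node count
        i = random.randrange(len(new))
        new[i] = min(max_nodes, max(min_nodes, new[i] + random.randint(-50, 50)))
    return new

def anneal(evaluate, init, min_nodes, max_nodes, t0=1.0, t_min=1e-3, alpha=0.9, steps=20):
    # Energy of a structure = reconstruction error returned by evaluate().
    current, e_current = init, evaluate(init)
    best, e_best = current, e_current
    t = t0
    while t > t_min:
        for _ in range(steps):
            cand = propose(current, min_nodes, max_nodes)
            e_cand = evaluate(cand)
            # Accept better structures always, worse ones with Boltzmann probability.
            if e_cand < e_current or random.random() < math.exp((e_current - e_cand) / t):
                current, e_current = cand, e_cand
                if e_current < e_best:
                    best, e_best = current, e_current
        t *= alpha  # geometric cooling
    return best, e_best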

Structure Optimization Model of a DBN
DBN Performance Measurement Based on Reconstruction Error
Structure Design Using ISA
Experiments and Results Analysis
Cifar-10 Dataset Classification Experiment
Reconstruction Error for Unsupervised Training
Prediction Error Rate and Time Complexity
ISA Algorithm Analysis
Conclusions