Abstract

In this paper, we propose a parameter training method for convolutional neural networks (CNNs). To introduce the orders and the cost function into CNN parameter optimization, we define a Hausdorff-like derivative by analyzing the definition of the Hausdorff derivative, and then propose an improved form of it. The improved Hausdorff-like (IHL) derivative is applied to parameter training in the backpropagation of CNNs, which introduces the orders into the backpropagation updates. The orders are adjusted according to the value of the cost function during training, so tuning the orders flexibly controls the training speed of the parameters. By analyzing how the orders govern the IHL derivative, we present an adaptive tuning approach for them. Combining the proposed training method with the adaptive moment estimation (Adam) algorithm yields an Adam algorithm with the IHL derivative. Experiments on residual networks show that the proposed algorithm outperforms the original Adam algorithm.
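For background, the standard Hausdorff (fractal) derivative that the abstract refers to is commonly defined as below; this is a reference sketch of the known definition only, since the paper's Hausdorff-like derivative and its improved (IHL) form are not spelled out in the abstract itself:

$$
\frac{\partial f(t)}{\partial t^{\alpha}} \;=\; \lim_{t_1 \to t} \frac{f(t_1) - f(t)}{t_1^{\alpha} - t^{\alpha}}, \qquad \alpha > 0,
$$

where $\alpha$ is the order. For $\alpha = 1$ this reduces to the ordinary derivative, which is presumably why tuning $\alpha$ can speed up or slow down the gradient-based parameter updates relative to standard backpropagation.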
