Abstract
In this paper, we propose a parameter training method for convolutional neural networks (CNNs). To introduce fractional orders and the cost function into CNN parameter optimization, we define a Hausdorff-like derivative by analyzing the definition of the Hausdorff derivative, and then propose an improved form of it. The improved Hausdorff-like (IHL) derivative is applied to the back-propagation training of CNN parameters, so that the orders enter the back-propagation updates. The orders are then adjusted according to the value of the cost function during training, and this tuning flexibly controls the training speed of the parameters. By analyzing how the orders influence the IHL derivative, we present an adaptive tuning approach for the orders. Combining the proposed training method with the adaptive moment estimation (Adam) algorithm yields an Adam algorithm with the IHL derivative. Experiments on residual networks show that the proposed algorithm outperforms the original Adam algorithm.
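The abstract does not give the exact update rule, but the idea can be sketched under explicit assumptions. In the sketch below, the IHL derivative is approximated by the classical Hausdorff (fractal) derivative scaling, dL/dw^α = (∂L/∂w) / (α·|w|^(α−1)), and the order α is nudged up when the cost decreases and down when it increases, before the scaled gradient is fed into a standard Adam step. The function names `ihl_adam_step` and `adapt_order`, the order step size, and the order bounds are hypothetical illustrations, not the authors' definitions.

```python
import numpy as np

def ihl_adam_step(w, grad, m, v, t, alpha, lr=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step driven by a Hausdorff-like derivative of order alpha.

    The scaling below follows the classical Hausdorff (fractal)
    derivative, dL/dw^alpha = (dL/dw) / (alpha * |w|**(alpha - 1));
    the paper's improved (IHL) form may differ.
    """
    # Assumed Hausdorff-like scaling of the ordinary gradient.
    ihl_grad = grad / (alpha * np.abs(w) ** (alpha - 1.0) + eps)
    m = beta1 * m + (1.0 - beta1) * ihl_grad        # first-moment estimate
    v = beta2 * v + (1.0 - beta2) * ihl_grad ** 2   # second-moment estimate
    m_hat = m / (1.0 - beta1 ** t)                  # bias correction
    v_hat = v / (1.0 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)     # standard Adam update
    return w, m, v

def adapt_order(alpha, cost, prev_cost, step=0.05, lo=0.5, hi=1.5):
    """Hypothetical adaptive order rule: raise the order while the cost
    is falling to speed training, lower it when the cost rises."""
    alpha = alpha + step if cost < prev_cost else alpha - step
    return float(np.clip(alpha, lo, hi))

# Toy usage on a quadratic loss L(w) = sum(w**2), gradient 2*w.
w = np.random.randn(10) * 0.1
m, v = np.zeros_like(w), np.zeros_like(w)
alpha, prev_cost = 1.0, np.inf
for t in range(1, 101):
    cost = float(np.sum(w ** 2))
    grad = 2.0 * w
    alpha = adapt_order(alpha, cost, prev_cost)
    w, m, v = ihl_adam_step(w, grad, m, v, t, alpha)
    prev_cost = cost
```

With α = 1 the scaling factor reduces to 1 and the loop recovers plain Adam, which is consistent with the abstract's framing of the IHL-derivative method as a generalization whose order tuning adjusts training speed.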