Objective: The Barzilai–Borwein (BB) method is a classical and widely used method for solving unconstrained optimization problems. The momentum method accelerates optimization algorithms via an exponentially weighted moving average of gradients. To design reliable deep learning optimization algorithms, this paper applies four variants of the BB method to deep learning optimization. Findings: The BB step size is generated with the momentum method under different step-size range limits. We also apply the momentum method and its variants to stochastic gradient descent with the BB step size. Novelty: The robustness of the algorithms is demonstrated through experiments with different initial learning rates and random seeds. Their sensitivity is assessed by varying the momentum factor until a suitable value is found. Moreover, we compare the proposed algorithms with popular algorithms on various neural networks. The results show that the new algorithms improve the efficiency of the BB step size in deep learning and offer a wider choice of optimization algorithms.
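To make the two ingredients concrete, the following is a minimal sketch of gradient descent that combines a BB1 step size (clipped to a step-size range, as the abstract describes) with momentum as an exponentially weighted moving average of gradients. All names, the clipping bounds, and the test problem are illustrative assumptions; this is not the paper's exact algorithm.

```python
import numpy as np

def bb_momentum_descent(grad, x0, alpha0=0.1, beta=0.9,
                        alpha_min=1e-4, alpha_max=1.0, iters=300):
    """Gradient descent with a BB1 step size clipped to
    [alpha_min, alpha_max], combined with an exponentially
    weighted moving average of gradients (momentum).
    Illustrative sketch only, not the paper's algorithm."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)                  # momentum buffer (EWMA of gradients)
    x_prev, g_prev = None, None
    alpha = alpha0                        # initial learning rate
    for _ in range(iters):
        g = grad(x)
        if g_prev is not None:
            s = x - x_prev                # iterate difference
            y = g - g_prev                # gradient difference
            denom = s @ y
            if abs(denom) > 1e-12:
                alpha = (s @ s) / denom   # BB1 step size
                # enforce the step-size range limit
                alpha = float(np.clip(alpha, alpha_min, alpha_max))
        x_prev, g_prev = x.copy(), g.copy()
        v = beta * v + (1.0 - beta) * g   # exponentially weighted moving average
        x = x - alpha * v
    return x

# Toy usage: minimize the convex quadratic f(x) = 0.5 * x^T A x,
# whose unique minimizer is the origin.
A = np.diag([1.0, 10.0])
x_star = bb_momentum_descent(lambda x: A @ x, np.array([5.0, 5.0]))
```

Clipping the BB step size to a fixed range is what keeps the update stable when the curvature estimate `s @ y` is small or noisy, which is the role the step-size range limits play in the abstract.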