Abstract

In this paper, a new adjustment to the damping parameter of the Levenberg-Marquardt algorithm is proposed to save training time and to reduce error oscillations. The damping parameter of the Levenberg-Marquardt algorithm switches the method between gradient descent and the Gauss-Newton method. It also affects training speed and induces error oscillations when the decay rate is fixed. Therefore, our damping strategy decreases the damping parameter using the inner product between weight vectors, making the Levenberg-Marquardt algorithm behave more like the Gauss-Newton method, and increases the damping parameter using a diagonally dominant matrix, making it act more like gradient descent. We tested the method on two simple classification problems and a handwritten digit recognition task. Simulations showed that our method improved training speed and produced fewer error oscillations than other algorithms.
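For context, the baseline the abstract refers to is the classic Levenberg-Marquardt scheme, in which the damping parameter is multiplied or divided by a fixed decay factor after each step. The sketch below illustrates that fixed-rate scheme on a toy one-parameter curve fit; it is not the paper's proposed inner-product/diagonal-dominance strategy, and the problem data and function names are hypothetical.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, w0, n_iter=50, lam=1e-2, factor=10.0):
    """Classic LM with a fixed decay factor: a large lam makes the step
    resemble gradient descent; a small lam approaches Gauss-Newton."""
    w = np.asarray(w0, dtype=float)
    err = 0.5 * np.sum(residual(w) ** 2)
    for _ in range(n_iter):
        r = residual(w)
        J = jacobian(w)
        # Damped normal equations: (J^T J + lam * I) dw = -J^T r
        dw = np.linalg.solve(J.T @ J + lam * np.eye(len(w)), -(J.T @ r))
        new_err = 0.5 * np.sum(residual(w + dw) ** 2)
        if new_err < err:
            w, err = w + dw, new_err
            lam /= factor   # step accepted: shift toward Gauss-Newton
        else:
            lam *= factor   # step rejected: shift toward gradient descent
    return w, err

# Hypothetical toy problem: fit y = exp(a * x) with true a = 0.7
x = np.linspace(0.0, 2.0, 20)
y = np.exp(0.7 * x)
res = lambda w: np.exp(w[0] * x) - y
jac = lambda w: (x * np.exp(w[0] * x)).reshape(-1, 1)

w_fit, final_err = levenberg_marquardt(res, jac, [0.0])
print(w_fit[0])  # should recover a close to 0.7
```

Because the decay factor here is fixed, the damping parameter can oscillate between accepted and rejected steps, which is exactly the source of the error oscillations the proposed adaptive strategy aims to reduce.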
