Abstract

The q-gradient method with Yuan steps (q-GY) uses a Yuan step size on odd iterations and a geometric-recursion step size on even iterations. This study aimed to accelerate convergence to a minimum point by reducing the number of iterations, dilating the independent variable by the parameter q, and comparing the results with three algorithms: the classical steepest descent (SD) method, the steepest descent method with Yuan steps (SDY), and the q-gradient method with geometric recursion (q-G). The numerical results are presented in tables and graphs. The study used the Rosenbrock function f(x) = (1 - x_1)^2 + 100(x_2 - x_1^2)^2 with μ = 1, σ_0 = 0.5, β = 0.999, and starting points x_0 drawn from a uniform distribution on the interval (-2.048, 2.048) in R^2; 49 starting points were executed using an online Python compiler on a 64-bit Core i3 laptop. The maximum number of iterations was 58,679. A tolerance limit of 10^-4 and the inequality f(x^*) > f were used as stopping criteria to obtain the numerical results. The q-GY method's downward movement toward the minimum point was better than that of the SD and SDY methods, and the numerical results on the Rosenbrock function showed sufficiently good performance in accelerating convergence to the minimum point.
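The baseline setup described above (Rosenbrock objective, tolerance near 10^-4, iteration cap of 58,679, starting points in (-2.048, 2.048)) can be sketched as a classical steepest descent run. This is a minimal illustrative sketch only: the backtracking (Armijo) line search below is an assumption standing in for the Yuan and q-gradient step-size rules, whose full details are not given in the abstract, and the specific starting point is chosen arbitrarily within the stated interval.

```python
import numpy as np

def rosenbrock(x):
    # f(x) = (1 - x_1)^2 + 100 (x_2 - x_1^2)^2, as defined in the abstract
    return (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    # Analytic gradient of the Rosenbrock function
    return np.array([
        -2.0 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

def steepest_descent(x0, tol=1e-4, max_iter=58_679):
    # Classical SD with backtracking (Armijo) line search; this step rule is
    # an assumption for illustration, not the paper's Yuan / q-GY step sizes.
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = rosenbrock_grad(x)
        if np.linalg.norm(g) < tol:   # stopping criterion on the order of 10^-4
            break
        t, fx = 1.0, rosenbrock(x)
        while rosenbrock(x - t * g) > fx - 1e-4 * t * (g @ g):
            t *= 0.5                  # shrink the step until sufficient decrease
        x = x - t * g
    return x, k

# One arbitrary starting point inside the stated interval (-2.048, 2.048)
x_min, iters = steepest_descent([-2.0, 2.0])
```

Since steepest descent converges slowly along the curved valley of the Rosenbrock function, this baseline is what the q-GY variant is reported to improve upon.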
