Abstract
An accelerated steepest descent method for solving unconstrained optimization problems is presented, which leads to a fundamentally different conjugate gradient method in which the well-known parameter β_k is computed by a new formula. Under common assumptions, and using a modified Wolfe line search, the descent property and global convergence of the new method are established. Experimental results provide evidence that the proposed method is in general superior to the classical steepest descent method and has the potential to significantly enhance the computational efficiency and robustness of the training process.
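The abstract does not reproduce the paper's new β_k formula, so the following is only a minimal sketch of the general framework it describes: a nonlinear conjugate gradient iteration whose step lengths satisfy Wolfe conditions. The function name nonlinear_cg is hypothetical, the classical Polak-Ribiere+ β_k stands in for the paper's new formula, and SciPy's line_search (a strong Wolfe implementation) stands in for the paper's modified Wolfe line search.

```python
import numpy as np
from scipy.optimize import line_search  # SciPy's strong Wolfe line search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear conjugate gradient loop (illustrative sketch).

    The Polak-Ribiere+ beta_k below is a stand-in: the paper's new
    beta_k formula, not reproduced in the abstract, would replace it.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first step: steepest descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the (strong) Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:           # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Stand-in beta_k (Polak-Ribiere+); the paper proposes its own formula
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d       # conjugate search direction update
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))  # -> approx. [1., 1.]
```

Setting β_k = 0 on every iteration reduces the loop to plain steepest descent, which is the baseline the abstract compares against.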