Abstract

The unconstrained optimization problem (UOP) has recently attracted considerable attention from researchers worldwide owing to its numerous real-life applications. The conjugate gradient (CG) method is among the most widely used algorithms for solving UOPs because of its good convergence properties and low memory requirements. This study investigates the performance of a modified CG coefficient on benchmark optimization functions and proves the sufficient descent property and global convergence of the new CG method under the standard Wolfe conditions. Computational results on several benchmark problems are presented to validate the robustness and efficacy of the new algorithm. The proposed method was also applied to function estimation in inverse heat transfer problems. Another notable feature of the proposed modification is its ability to solve large-scale problems of varying dimensions. Based on the theoretical and computational efficiency of the new method, we conclude that the new coefficient can be a better alternative for solving unconstrained optimization and real-life application problems.

KEYWORDS: computational efficiency, global convergence, inverse heat, low memory, optimization problems, theoretical efficiency
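The abstract does not give the modified CG coefficient itself, so the following is only a minimal sketch of the generic nonlinear CG framework the paper builds on, using the classical Fletcher-Reeves coefficient as a stand-in and a simple Armijo backtracking line search (the paper uses the Wolfe conditions; Armijo is a simplification here). All names and parameters are illustrative assumptions, not the authors' method.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG sketch. The CG coefficient beta below is
    Fletcher-Reeves, a classical stand-in; the paper proposes a
    modified coefficient that is not specified in the abstract."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (simplified from Wolfe)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # safeguard: restart if not a
            d = -g_new                     # descent direction
        x, g = x_new, g_new
    return x

# Illustrative use: minimize a convex quadratic f(x) = x'Ax/2 - b'x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * (x @ A @ x) - b @ x
grad = lambda x: A @ x - b
sol = nonlinear_cg(f, grad, np.array([5.0, -3.0]))
```

In this framework, only the formula for `beta` changes between CG variants; the paper's contribution is a new choice of that coefficient with proven sufficient descent and global convergence under the Wolfe conditions.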
