Abstract

This paper presents a general and comprehensive description of Optimization Methods and Algorithms from a novel viewpoint. It is shown, in particular, that Direct Methods, Iterative Methods, and Computer Science Algorithms belong to a well-defined general class of both Finite and Infinite Procedures, characterized by suitable descent directions.

Highlights

  • The dichotomy between Computer Science and Numerical Analysis has been for many years the main obstacle to the development of eclectic computational tools

  • The main aim of the present paper is to show that Gradient or Gradient-type methods represent the fundamental computational tool for solving a wide set of continuous optimization problems, since they are based on a unitary principle that applies to both finite and infinite procedures

  • The following question arises: does QP characterize the boundary separating finite continuous constrained optimization problems from infinite ones? In other words, do there exist more general nonlinear constrained optimization problems that can be solved in a finite number of iterations? Since, in the unconstrained case, we have shown in the previous paragraph that there exist nonquadratic problems that can be solved exactly in a finite number of iterations by utilizing the CG method, the answer is expected to be positive
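The finite-termination property mentioned in the last highlight can be illustrated on the quadratic case. The sketch below (a minimal linear conjugate gradient implementation, not the authors' code) minimizes ½xᵀAx − bᵀx for a symmetric positive definite A; in exact arithmetic CG reaches the exact minimizer in at most n iterations, where n is the problem dimension. The function name and the example matrix are illustrative choices.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-12):
    """Linear CG for min 0.5*x^T A x - b^T x, A symmetric positive definite.

    Illustrative sketch: terminates in at most n iterations (exact arithmetic).
    """
    n = b.size
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x          # residual = negative gradient of the quadratic
    p = r.copy()           # first descent direction is the steepest descent one
    for k in range(n):     # at most n steps for an n-dimensional quadratic
        rr = r @ r
        if np.sqrt(rr) < tol:
            return x, k
        Ap = A @ p
        alpha = rr / (p @ Ap)   # exact line search along the direction p
        x = x + alpha * p
        r = r - alpha * Ap
        beta = (r @ r) / rr     # conjugacy-preserving update of the direction
        p = r + beta * p
    return x, n

# A 3x3 SPD system: CG solves it in at most 3 iterations.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x, iters = conjugate_gradient(A, b)
print(iters, np.allclose(A @ x, b))
```

On nonquadratic problems CG is restarted periodically and finite termination is no longer guaranteed in general, which is precisely why the boundary question raised above is of interest.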


Summary

Introduction

The dichotomy between Computer Science and Numerical Analysis has been for many years the main obstacle to the development of eclectic computational tools. The novel results on Tensor computation [13] represent a promising research direction for improving the efficiency of global optimization algorithms on large-scale problems and for the effective construction of more general sets of Repeller matrices in the Tunneling phases [14]. This approach can have important consequences in Nonlinear Integer Optimization (see the pioneer work in ), taking into account the more recent results concerning the discretization of the problem by continuation methods (see, e.g., ).

The Gradient and the Gradient-Type Approach
Local Unconstrained Optimization
Local Constrained Optimization
Global Optimization
Discrete Optimization
Conclusions