Abstract

In this work, a new method is presented for determining the binding constraints of a general linear maximization problem. The method uses only objective function values at points determined by simple vector operations, so its computational cost is lower than the corresponding cost of matrix manipulation and/or inversion. It builds on a recently proposed notion for addressing such problems: the average of each constraint. Identifying the binding constraints reduces the complexity and the dimension of the problem, resulting in a significant decrease of the computational cost compared to Simplex-like methods. The new method is highly useful when dealing with very large linear programming (LP) problems, where only a relatively small percentage of constraints are binding at the optimal solution, as in many transportation, management and economic problems, since it reduces the size of the problem. The method has been implemented and tested on a large number of LP problems. In LP problems without superfluous constraints, the algorithm was 100% successful in identifying the binding constraints, while on a set of large-scale LP test problems that included superfluous constraints, the success rate of the algorithm, viewed as a statistical tool for identifying binding constraints, was up to 90.4%.
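The abstract does not define the "average of each constraint" or the exact points at which the objective is evaluated, so the sketch below is only a hypothetical illustration of the general idea of ranking constraints by cheap, vector-operation-based objective evaluations. The scoring rule shown (the objective evaluated at each normalized constraint gradient, which reduces to the classical cosine criterion from the redundancy literature) is an assumption for illustration, not the authors' method.

```python
import numpy as np

def rank_constraints(c, A):
    """Rank the constraints a_i^T x <= b_i of  max c^T x  by a cheap surrogate score.

    Placeholder score (an assumption, not the paper's 'constraint average'):
    the objective value at the normalized constraint gradient a_i / ||a_i||,
    i.e. c^T a_i / ||a_i||, computed with simple vector operations only.
    Constraints whose normals point most strongly in the objective direction
    are treated as more likely to be binding at the maximum.
    """
    norms = np.linalg.norm(A, axis=1)   # ||a_i|| for every constraint row
    scores = (A @ c) / norms            # c^T (a_i / ||a_i||), no matrix inversion
    return np.argsort(-scores)          # most promising constraints first

# Toy problem:  max x1 + x2  s.t.  x1 + x2 <= 2,  x1 <= 5,  x2 <= 5,  x >= 0.
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
b = np.array([2.0, 5.0, 5.0])           # right-hand sides (not used by this score)
print(rank_constraints(c, A))           # the binding constraint x1 + x2 <= 2 ranks first
```

Under this kind of scheme one would keep only the top-ranked constraints, solve the reduced LP, and verify that its solution satisfies every original constraint, re-adding any that are violated.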

Highlights

  • It is well known that in large linear programming (LP) problems there is a significant number of redundant constraints and variables

  • The new method is highly useful when dealing with very large linear programming (LP) problems, where only a relatively small percentage of constraints are binding at the optimal solution, as in many transportation, management and economic problems, since it reduces the size of the problem

  • The algorithm was also evaluated as a statistical tool for correctly identifying binding constraints in random linear programming problems, and the results of this statistical approach are presented in the third part of the corresponding section

Summary

Introduction

It is well known that in large linear programming (LP) problems there is a significant number of redundant constraints and variables. Reducing the problem in this way results, among other benefits, in less computational time and effort. In this direction, many researchers, including Andersen and Andersen [5], Balinsky [6], Boot [7], Brearly et al. [8], Boneh et al. [9], Caron et al. [10], Ioslovich [11], Gal [12], Gutman and Ioslovich [13], Mattheis [14], Nikolopoulou et al. [15], Stojkovic and Staminirovic [16], Paulraj et al. [17], [18] and Telgen [19], have proposed several algorithms to identify redundant constraints in order to reduce the dimension of the initial large-scale LP problem.
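As a concrete baseline for what "identifying redundant constraints" means in this literature, the sketch below applies the classical LP-based test (a standard technique in the cited redundancy work, not the new method of this paper): a constraint a_i^T x <= b_i is redundant if maximizing a_i^T x subject to the remaining constraints cannot exceed b_i. The function name and tolerance are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def redundant_constraints(A, b):
    """Return indices of constraints a_i^T x <= b_i that are redundant,
    assuming x >= 0 (linprog's default variable bounds).

    Classical test: constraint i is redundant if
        max a_i^T x  s.t. the remaining constraints  <=  b_i.
    Note that this check solves one LP per constraint, which is exactly the
    kind of cost that cheaper identification heuristics try to avoid.
    """
    m = A.shape[0]
    redundant = []
    for i in range(m):
        others = [k for k in range(m) if k != i]
        # linprog minimizes, so maximize a_i^T x by minimizing -a_i^T x.
        res = linprog(-A[i], A_ub=A[others], b_ub=b[others], method="highs")
        if res.success and -res.fun <= b[i] + 1e-9:
            redundant.append(i)
    return redundant

# Small example: the third constraint x1 + x2 <= 3 is implied by x1 <= 1, x2 <= 1.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 3.0])
print(redundant_constraints(A, b))   # -> [2]
```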

A Notion about the Weighted Average
The New Method: A Geometrical Proof of Convergence
The Proposed Algorithm
Computational Cost
Numerical Results
Illustration of a LP Problem
Concluding Remarks and Further Research
