Convex underestimation techniques for nonlinear functions are an essential part of global optimization. These techniques usually involve the addition of new variables and constraints. In the case of posynomial functions \(x_1^{\alpha_1} x_2^{\alpha_2} \cdots x_n^{\alpha_n}\), logarithmic transformations (Maranas and Floudas, Comput. Chem. Eng. 21:351–370, 1997) are typically used. This study develops an effective method for finding a tight relaxation of a posynomial function by introducing variables \(y_j\) and positive parameters \(\beta_j\), for all \(\alpha_j > 0\), such that \(y_j = x_j^{-\beta_j}\). By specifying the \(\beta_j\) carefully, we can obtain a tighter underestimation than existing methods provide.
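As a sketch of the algebra behind the substitution (the rule for choosing the \(\beta_j\) and the relaxation of the defining constraints are the paper's contribution and are not reproduced here): for each index \(j\) with \(\alpha_j > 0\), setting \(y_j = x_j^{-\beta_j}\) with \(\beta_j > 0\) gives

\[
x_j^{\alpha_j} \;=\; \bigl(x_j^{-\beta_j}\bigr)^{-\alpha_j/\beta_j} \;=\; y_j^{-\alpha_j/\beta_j},
\qquad\text{so}\qquad
\prod_{j=1}^{n} x_j^{\alpha_j}
\;=\;
\prod_{j:\,\alpha_j>0} y_j^{-\alpha_j/\beta_j}
\prod_{j:\,\alpha_j\le 0} x_j^{\alpha_j}.
\]

Every exponent on the right-hand side is nonpositive, and a monomial with nonpositive exponents is convex over the positive orthant, so the nonconvexity is confined to the defining constraints \(y_j = x_j^{-\beta_j}\); the abstract indicates that a careful choice of the \(\beta_j\) is what tightens the resulting relaxation.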