Data-driven distributionally robust chance-constrained optimization (DRCCP) is a powerful technique for handling optimization problems with uncertainty in the constraint functions. However, outliers and extreme samples in the data set may degrade the decision quality of DRCCP. Although numerous outlier detection techniques exist, they are either unable to pinpoint the samples that cause overly conservative solutions or incompatible with DRCCP models. This work proposes a novel and widely compatible algorithm that generates a representative subset of the original data set by removing the samples responsible for overly conservative DRCCP solutions. With the proposed approach, the DRCCP solution quality can be enhanced while solution feasibility is maintained. To illustrate its effectiveness, we examine two numerical examples and a nonlinear process optimization problem.
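To make the idea of pruning samples that drive conservatism concrete, the following is a minimal illustrative sketch, not the paper's algorithm: a scenario-based chance-constrained LP in which the tightest training samples are greedily removed as long as an empirical violation estimate on held-out data stays within a target level. All names and numbers (c, b, eps, n_train, the toy sampling distribution) are assumptions introduced purely for illustration.

```python
# Hedged sketch (NOT the paper's method): greedy removal of conservative
# samples in a scenario-based chance-constrained LP. All data are synthetic.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy problem: maximize c^T x  s.t.  a_i^T x <= b for sampled coefficient
# vectors a_i (uncertain), x >= 0. Solved as min -c^T x with linprog.
c = np.array([3.0, 2.0])
b = 10.0
eps = 0.10                      # target constraint-violation probability
n_train, n_valid = 60, 2000
a_train = rng.normal([1.0, 1.5], 0.3, size=(n_train, 2))
a_valid = rng.normal([1.0, 1.5], 0.3, size=(n_valid, 2))

def solve_scenario(a_samples):
    """Scenario LP: enforce the uncertain constraint at every kept sample."""
    res = linprog(-c, A_ub=a_samples, b_ub=np.full(len(a_samples), b),
                  bounds=[(0, None), (0, None)], method="highs")
    return res.x

def violation_rate(x, a_samples):
    """Empirical probability that a^T x > b on held-out samples."""
    return float(np.mean(a_samples @ x > b))

kept = a_train.copy()
x = solve_scenario(kept)
# Greedily drop the sample whose constraint has the smallest slack (the one
# most responsible for conservatism), as long as the held-out violation
# estimate stays within eps.
while len(kept) > 1:
    slack = b - kept @ x
    candidate = np.delete(kept, np.argmin(slack), axis=0)
    x_new = solve_scenario(candidate)
    if violation_rate(x_new, a_valid) > eps:
        break                   # further removal would risk infeasibility
    kept, x = candidate, x_new

print("solution:", x, "objective:", c @ x,
      "est. violation prob.:", violation_rate(x, a_valid))
```

The sketch only conveys the trade-off the abstract describes: removing extreme samples improves the objective, while a feasibility check guards against discarding too many.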