Abstract

In classical inverse linear optimization, one assumes that a given solution is a candidate to be optimal. Real data are imperfect and noisy, so there is no guarantee that this assumption is satisfied. Inspired by regression, this paper presents a unified framework for cost function estimation in linear optimization comprising a general inverse optimization model and a corresponding goodness-of-fit metric. Although our inverse optimization model is nonconvex, we derive a closed-form solution and present the geometric intuition. Our goodness-of-fit metric, ρ, the coefficient of complementarity, has similar properties to R² from regression and is quasi-convex in the input data, leading to an intuitive geometric interpretation. While ρ is computable in polynomial time, we derive a lower bound that possesses the same properties, is tight for several important model variations, and is even easier to compute. We demonstrate the application of our framework for model estimation and evaluation in production planning and cancer therapy. This paper was accepted by Yinyu Ye, optimization.
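
To fix ideas, the sketch below illustrates the basic inverse linear optimization setup described in the abstract: given a feasible region {x : Ax ≥ b} and an observed (possibly suboptimal) decision x0, estimate a cost vector c that makes x0 as nearly optimal as possible. This is a minimal illustration under assumed conventions (constraints in Ax ≥ b form, a 1-norm normalization of the cost vector, and hypothetical toy data), not the paper's exact formulation of the model or of ρ.

```python
import numpy as np

def inverse_lp_cost(A, b, x0):
    """Estimate a cost vector c for min c'x s.t. Ax >= b from an observed x0.

    Sketch: minimize the duality gap y'(A x0 - b) over dual weights y >= 0
    with 1'y = 1. The optimum places all weight on the constraint with the
    smallest slack at x0, so the estimated cost is that constraint's normal
    (a closed-form solution in the spirit of, but not identical to, the one
    derived in the paper).
    """
    slack = A @ x0 - b                      # per-constraint slack at x0
    i_star = int(np.argmin(slack))          # most nearly binding constraint
    norm = np.linalg.norm(A[i_star], 1)
    c_hat = A[i_star] / norm                # normalized cost estimate
    gap = slack[i_star] / norm              # residual fit error; 0 iff x0 is optimal for c_hat
    return c_hat, gap

# Hypothetical toy example: feasible set {x : x1 >= 1, x2 >= 1, x1 + x2 >= 3}
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, 3.0])
x0 = np.array([1.2, 2.0])                   # observed, slightly suboptimal decision
c_hat, gap = inverse_lp_cost(A, b, x0)
print(c_hat, gap)                           # e.g. [1. 0.] 0.2
```

The residual `gap` plays a role loosely analogous to a regression residual: it is zero exactly when the observed decision is optimal for the estimated cost, which is the intuition behind a goodness-of-fit measure such as ρ.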
