Abstract
We propose a method for solving quantile optimization problems with a loss function that depends on a vector of small random parameters. The method replaces the original nonlinear loss function with a model linearized with respect to the random vector. We show that, to first approximation, the quantile optimization problem reduces to a minimax problem in which the uncertainty set is the kernel of the probability measure.
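The idea can be illustrated numerically. The following is a minimal sketch, not the authors' implementation: the loss `phi`, the perturbation scale `eps`, and the use of a Gaussian random vector (whose alpha-kernel is a ball of radius equal to the standard normal alpha-quantile) are all illustrative assumptions. The sketch linearizes the loss in the random vector, maximizes the linear model over the kernel ball (which has the closed form `phi0(u) + eps * r * ||g(u)||`), minimizes the resulting surrogate in the decision variable, and compares the surrogate value with a Monte Carlo estimate of the true loss quantile.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

eps = 0.1  # scale of the small random perturbation (assumed)

# Hypothetical nonlinear loss Phi(u, x); x = eps * N(0, I_2).
# The x[0]*x[1] term is a small nonlinear remainder dropped by linearization.
def phi(u, x):
    return (u - 1.0) ** 2 + u * x[0] + np.sin(u) * x[1] + x[0] * x[1]

# Linearization in x at x = 0: Phi(u, x) ~ phi0(u) + grad_x(u) . x
def phi0(u):
    return (u - 1.0) ** 2

def grad_x(u):
    return np.array([u, np.sin(u)])

alpha = 0.95
r = norm.ppf(alpha)  # radius of the alpha-kernel of N(0, I): a Euclidean ball

# Minimax surrogate: maximizing the linear model over the kernel ball gives
# phi0(u) + eps * r * ||grad_x(u)||, which we then minimize in u.
def surrogate(u):
    return phi0(u) + eps * r * np.linalg.norm(grad_x(u))

res = minimize_scalar(surrogate, bounds=(-5.0, 5.0), method="bounded")

# Monte Carlo check: empirical alpha-quantile of the true loss at the minimizer.
rng = np.random.default_rng(0)
xs = eps * rng.standard_normal((100_000, 2))
mc_quantile = np.quantile(phi(res.x, xs.T), alpha)
print(res.x, res.fun, mc_quantile)
```

For this toy loss the surrogate minimum lies close to the Monte Carlo quantile, since the dropped remainder is of second order in `eps`.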