Abstract

In this paper, we study a class of stochastic optimization problems, referred to as conditional stochastic optimization (CSO), of the form $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}\left[f_\xi\left(\mathbb{E}_{\eta|\xi}[g_\eta(x, \xi)]\right)\right]$, which finds a wide spectrum of applications including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming the availability of samples from the distribution $\mathbb{P}(\xi)$ and of samples from the conditional distribution $\mathbb{P}(\eta|\xi)$, we establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from $\mathcal{O}(d/\epsilon^4)$ to $\mathcal{O}(d/\epsilon^3)$ when the outer function is smooth, and further to $\mathcal{O}(1/\epsilon^2)$ when the empirical function satisfies a quadratic growth condition. We also establish the sample complexity of a modified SAA for the case where $\xi$ and $\eta$ are independent. Several numerical experiments further support our theoretical findings.
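To make the estimator concrete, the following is a minimal sketch of the nested SAA objective commonly used for CSO, $\hat{F}(x) = \frac{1}{n}\sum_{i=1}^{n} f_{\xi_i}\big(\frac{1}{m}\sum_{j=1}^{m} g_{\eta_{ij}}(x, \xi_i)\big)$, where $\xi_1,\dots,\xi_n$ are drawn from $\mathbb{P}(\xi)$ and $\eta_{i1},\dots,\eta_{im}$ from $\mathbb{P}(\eta|\xi_i)$. The callables `f`, `g`, and `inner_sampler` are hypothetical stand-ins for user-supplied problem data, not functions defined in the paper.

```python
import numpy as np

def saa_objective(x, xis, inner_sampler, f, g, m):
    """Nested SAA estimate of E_xi[ f(xi, E_{eta|xi}[g(eta, x, xi)]) ].

    xis           : iterable of n outer samples xi_i ~ P(xi)
    inner_sampler : inner_sampler(xi, m) -> m samples eta_ij ~ P(eta | xi)
    f, g          : hypothetical stand-ins for the outer and inner functions
    m             : number of inner samples per outer sample
    """
    vals = []
    for xi in xis:
        etas = inner_sampler(xi, m)
        # Empirical estimate of the conditional expectation E[g(eta, x, xi) | xi].
        inner = np.mean([g(eta, x, xi) for eta in etas], axis=0)
        # Outer function evaluated at the (biased) inner estimate.
        vals.append(f(xi, inner))
    return float(np.mean(vals))
```

Minimizing this empirical objective over $x \in \mathcal{X}$, e.g., with a projected (sub)gradient method, yields the SAA solution; the stated complexities bound the total number of samples $n \cdot m$ needed for an $\epsilon$-accurate solution under the respective assumptions.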
