Abstract

In “Enhanced Balancing of Bias-Variance Tradeoff in Stochastic Estimation: A Minimax Perspective”, the authors study a framework for constructing new classes of stochastic estimators that consistently beat existing benchmarks regardless of key model parameter values. Biased estimators, such as finite-difference estimators in black-box stochastic gradient estimation, often require the selection of tuning parameters to balance bias against variance and ultimately minimize the overall error. Unfortunately, this selection relies on model knowledge that is unknown a priori and thus leads to ad hoc choices in practice. The authors introduce a new notion, the asymptotic minimax risk ratio, designed to compare new estimators against existing benchmarks: a ratio less than one implies that the new estimator asymptotically outperforms the benchmark regardless of the model parameter value. Based on this notion, the authors study an outperforming weighting scheme by explicitly analyzing the asymptotic minimax risk ratio via a tractable reformulation of a nonconvex optimization problem.
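To make the underlying bias-variance tradeoff concrete, the sketch below (not taken from the paper; the objective `noisy_f`, the noise level, and the perturbation sizes are illustrative assumptions) shows the standard forward finite-difference gradient estimator for a noisy black-box function. Its perturbation size `h` is the tuning parameter the abstract refers to: the bias grows roughly like O(h) while the noise contribution grows like O(sigma/h), so the overall error depends on a choice that in practice requires unknown model knowledge.

```python
import numpy as np

def noisy_f(x, sigma=0.1, rng=None):
    """Hypothetical black-box objective: f(x) = x**3 observed with Gaussian noise."""
    rng = rng or np.random.default_rng()
    return x**3 + sigma * rng.normal()

def forward_difference_grad(x, h, n_samples, sigma=0.1, seed=0):
    """Average n_samples forward-difference estimates of f'(x) with perturbation h."""
    rng = np.random.default_rng(seed)
    estimates = [
        (noisy_f(x + h, sigma, rng) - noisy_f(x, sigma, rng)) / h
        for _ in range(n_samples)
    ]
    return float(np.mean(estimates))

if __name__ == "__main__":
    x, true_grad = 1.0, 3.0  # f(x) = x**3, so f'(1) = 3
    for h in [1.0, 0.1, 0.01, 0.001]:
        est = forward_difference_grad(x, h, n_samples=1000)
        # Large h -> bias from the Taylor remainder dominates;
        # small h -> the amplified noise term (sigma / h) dominates.
        print(f"h={h:>6}: estimate={est:.3f}, error={abs(est - true_grad):.3f}")
```

The paper's framework addresses exactly this difficulty: rather than picking a single tuning parameter, it compares weighted combinations of estimators to a benchmark through the asymptotic minimax risk ratio, so that outperformance holds uniformly over the unknown model parameter.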
