Abstract

Black-Box Quantile Optimization via Finite-Difference-Based Gradient Approximation

Risk management necessitates consideration of metrics such as quantiles to supplement conventional mean performance measures. In “Quantile Optimization via Multiple-Timescale Local Search for Black-Box Functions,” J. Hu, M. Song, and M. C. Fu consider the problem of optimizing a quantile of a black-box output. They introduce two new iterative multi-timescale stochastic approximation algorithms that use finite-difference-based gradient estimators. The first algorithm requires 2d + 1 samples of the black-box function per iteration, where d is the number of decision variables. The second employs a simultaneous-perturbation-based gradient estimator that uses only three samples per iteration, irrespective of the number of decision variables. The authors prove strong local convergence for both algorithms and analyze their finite-time convergence rates through a novel fixed-point argument. Both algorithms perform well across a varied set of benchmark problems.
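To make the simultaneous-perturbation idea concrete, the following is a minimal, simplified sketch of quantile minimization via a random-direction finite difference. It is not the authors' algorithm: their method uses multi-timescale recursions with only three black-box samples per iteration, whereas this sketch replaces the quantile-tracking recursion with a batch empirical quantile at each perturbed point, and the step-size and perturbation schedules, the test function, and the noise model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def blackbox(x):
    # Hypothetical noisy black-box: quadratic loss plus Gaussian noise.
    return float(np.sum(x**2)) + rng.normal(scale=1.0)

def empirical_quantile(x, alpha, n):
    # Batch estimate of the alpha-quantile of blackbox(x).
    # (The paper instead tracks quantiles on a faster timescale
    # from single samples; this batch version is a simplification.)
    return np.quantile([blackbox(x) for _ in range(n)], alpha)

def sp_quantile_minimize(x0, alpha=0.9, iters=200, n=64):
    # Simultaneous-perturbation local search: two quantile estimates
    # per iteration along a random direction, regardless of dimension d.
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        a_k = 0.5 / k          # step-size schedule (assumed)
        c_k = 0.5 / k**0.25    # perturbation schedule (assumed)
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # Rademacher direction
        q_plus = empirical_quantile(x + c_k * delta, alpha, n)
        q_minus = empirical_quantile(x - c_k * delta, alpha, n)
        grad_hat = (q_plus - q_minus) / (2.0 * c_k) * delta
        x -= a_k * grad_hat
    return x

x_star = sp_quantile_minimize([2.0, -3.0])
```

On this toy quadratic, the 0.9-quantile is minimized at the origin, so the iterates should drift toward zero despite each gradient estimate using only two (noisy) quantile evaluations per iteration.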

