Some properties of a modification of the multidimensional Kiefer-Wolfowitz stochastic approximation algorithm are presented. First, an iterative method for estimating the Hessian matrix of a regression function is proposed. This method is then used in conjunction with the Kiefer-Wolfowitz procedure to obtain a stochastic analogue of the Newton-Raphson gradient search method. It is shown that this technique locates the minimum of a regression function and that, under certain conditions, it achieves accelerated convergence.
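The procedure described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the test function `noisy_f`, the step-size and perturbation sequences, and the running-average Hessian update are all assumptions chosen for the example. Gradients and Hessians are estimated by central finite differences of noisy function evaluations (Kiefer-Wolfowitz style), and the Hessian estimates are averaged iteratively to drive a Newton-type step.

```python
import numpy as np

def noisy_f(x, rng):
    # Hypothetical regression function: quadratic with minimum at (1, -2),
    # observed with additive zero-mean noise (an assumption for illustration).
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 2.0) ** 2 + 0.1 * rng.standard_normal()

def fd_gradient(f, x, c, rng):
    # Central finite-difference gradient from noisy evaluations.
    d = len(x)
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = c
        g[i] = (f(x + e, rng) - f(x - e, rng)) / (2.0 * c)
    return g

def fd_hessian(f, x, c, rng):
    # Symmetric finite-difference Hessian estimate from noisy evaluations.
    d = len(x)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            ei = np.zeros(d); ei[i] = c
            ej = np.zeros(d); ej[j] = c
            H[i, j] = (f(x + ei + ej, rng) - f(x + ei - ej, rng)
                       - f(x - ei + ej, rng) + f(x - ei - ej, rng)) / (4.0 * c * c)
    return 0.5 * (H + H.T)

def stochastic_newton(f, x0, n_iter=500, seed=0):
    # Kiefer-Wolfowitz iteration with a Newton-Raphson-style direction:
    # x_{k+1} = x_k - a_k * H_bar_k^{-1} * g_k,
    # where H_bar_k is a running average of Hessian estimates.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    H_bar = np.eye(len(x))
    for k in range(1, n_iter + 1):
        a = 1.0 / k           # gains a_k -> 0 with divergent sum
        c = 1.0 / k ** 0.25   # perturbations c_k -> 0
        g = fd_gradient(f, x, c, rng)
        H_bar += (fd_hessian(f, x, c, rng) - H_bar) / k  # iterative Hessian estimate
        step = np.linalg.solve(H_bar + 1e-6 * np.eye(len(x)), g)
        x = x - a * step
    return x

x_star = stochastic_newton(noisy_f, [4.0, 4.0])
```

Scaling the gradient by the inverse of the averaged Hessian is what gives the Newton-Raphson flavor; for a well-conditioned quadratic objective the iterate moves near the minimizer quickly, whereas the plain Kiefer-Wolfowitz recursion would approach it only at the rate of the gain sequence.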