This paper presents a stochastic approximation (SA) algorithm for finding the root of a multivariate gradient equation, which arises in function minimization. The algorithm uses a "simultaneous perturbation" gradient approximation instead of the standard finite difference method. The key idea is to use a single vector of perturbations to estimate the gradient, which reduces the number of noisy measurements required compared to the standard finite difference approach. This results in a more efficient algorithm, especially for high-dimensional problems.
The algorithm is based on the idea of perturbing all components of the parameter vector simultaneously with a single vector of independent random perturbations when estimating the gradient. This approach requires significantly fewer measurements than the standard two-sided finite difference method, which requires 2p noisy measurements per iteration, where p is the problem dimension. The simultaneous perturbation method requires only 2q measurements per iteration, where q is the number of gradient approximations averaged at each iteration and can be much smaller than p (as small as one) for large p. This leads to a significant improvement in efficiency, because the number of iterations needed to achieve a given level of accuracy does not grow enough to offset the reduced number of measurements per iteration.
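To make this concrete, here is a minimal sketch of a simultaneous-perturbation gradient estimate. The function name `sp_gradient` is illustrative, and the ±1 Bernoulli perturbations are one common distribution satisfying the algorithm's conditions; this is a sketch of the idea, not the paper's exact formulation.

```python
import numpy as np

def sp_gradient(f, theta, c, rng):
    """Simultaneous-perturbation estimate of the gradient of f at theta.

    Uses ONE random perturbation vector delta (+/-1 Bernoulli here, a
    common choice) and only TWO noisy measurements of f, regardless of
    the dimension p of theta.
    """
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # simultaneous perturbations
    y_plus = f(theta + c * delta)
    y_minus = f(theta - c * delta)
    # All p components of the estimate share the same two measurements;
    # only the elementwise divisor delta_i differs across components.
    return (y_plus - y_minus) / (2.0 * c * delta)
```

Averaging q such estimates per iteration gives the 2q measurement count noted above.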
The paper provides theoretical justification for the algorithm, including results on strong (almost sure) convergence and asymptotic normality of the iterates. It also compares the performance of the simultaneous perturbation SA (SPSA) algorithm with the multivariate Kiefer-Wolfowitz finite difference SA (FDSA) algorithm. The results show that SPSA can achieve the same level of estimation accuracy with significantly fewer function measurements, making it more efficient for high-dimensional problems.
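For contrast with the two-measurement estimate above, a sketch of the Kiefer-Wolfowitz two-sided finite-difference estimate follows; it perturbs one coordinate at a time and so consumes 2p measurements per gradient approximation. The function name `fd_gradient` is illustrative.

```python
import numpy as np

def fd_gradient(f, theta, c):
    """Two-sided Kiefer-Wolfowitz finite-difference gradient estimate.

    Each of the p coordinates is perturbed separately, requiring
    2p measurements of f per gradient approximation.
    """
    p = theta.size
    g = np.empty(p)
    for i in range(p):
        e = np.zeros(p)
        e[i] = 1.0  # unit vector along coordinate i
        g[i] = (f(theta + c * e) - f(theta - c * e)) / (2.0 * c)
    return g
```

The measurement counts (2 vs. 2p per gradient estimate) are the source of SPSA's efficiency advantage when p is large.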
Numerical experiments are conducted to evaluate the performance of the SPSA and FDSA algorithms on a large-dimensional problem. The results show that SPSA outperforms FDSA in terms of efficiency, requiring significantly fewer measurements to achieve the same level of accuracy. The study also compares the performance of SPSA with other algorithms, such as the random directions method, and shows that SPSA is more efficient in terms of both computational cost and accuracy. The results indicate that SPSA is a promising approach for high-dimensional optimization problems.
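A complete SPSA iteration can be sketched as follows. The function name `spsa_minimize`, the gain sequences a_k = a/(k+1) and c_k = c/(k+1)^(1/6), and all numeric values are illustrative choices satisfying the usual decay conditions, not the paper's tuned settings.

```python
import numpy as np

def spsa_minimize(f, theta0, a=0.2, c=0.1, iters=3000, seed=0):
    """Sketch of an SPSA loop driven by noisy measurements of f.

    Illustrative decaying gains: a_k = a/(k+1), c_k = c/(k+1)**(1/6).
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(iters):
        ak = a / (k + 1)                      # step-size gain
        ck = c / (k + 1) ** (1.0 / 6.0)       # perturbation-size gain
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        y_plus = f(theta + ck * delta)        # two noisy measurements
        y_minus = f(theta - ck * delta)       # per iteration, for any p
        ghat = (y_plus - y_minus) / (2.0 * ck * delta)
        theta = theta - ak * ghat             # SA update step
    return theta
```

On a noisy quadratic loss this drives the iterate toward the minimizer while using only two measurements per iteration, whatever the dimension.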