ON BAYESIAN METHODS FOR SEEKING THE EXTREMUM

J. Mockus
Bayesian methods for global optimization are based on minimizing the expected deviation from the extremum, under the assumption that the function to be minimized is a realization of a stochastic function. The method is defined by a decision function that maps previous observations to the next observation point; the Bayesian method is the decision function that minimizes the expected deviation from the extremum, and it satisfies certain conditions for convergence. Noisy observations are handled naturally: the observed value is modeled as the true function value plus a random noise term. An illustrative example shows how the Bayesian method proceeds for a quadratic function.

Convergence of the Bayesian method depends on several conditions, including compactness of the domain, continuity of the function, and suitable properties of the a priori distribution. A simplified version, the one-stage method, treats each observation as if it were the last one and converges to the minimum under the same conditions. In the restricted-memory case only a limited number of previous observations may be used, which can affect convergence.

The Bayesian method has been implemented for stochastic Gaussian fields and Wiener processes, and it has been applied to practical problems in signal estimation and experimental planning. The development of suitable a priori distributions for different classes of functions is crucial for the application of Bayesian methods to global optimization.
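The one-stage (myopic) rule combined with a Wiener-process prior lends itself to a compact sketch. The Python code below is a minimal illustration under stated assumptions, not the paper's original algorithm: it assumes a noise-free Wiener-process prior on a one-dimensional interval, uses the Brownian-bridge posterior between neighbouring observations, and chooses each new point by maximizing the expected one-step improvement over a candidate grid. The function and parameter names (wiener_posterior, expected_improvement, one_stage_bayes_min, sigma, the test function) are illustrative choices, not from the source.

```python
import numpy as np
from scipy.stats import norm

def wiener_posterior(x, xs, ys, sigma=1.0):
    """Posterior mean and std at x under a Wiener-process prior, given
    sorted, noise-free observations (xs, ys) that bracket x."""
    j = np.searchsorted(xs, x)          # index of the right neighbour
    xl, xr = xs[j - 1], xs[j]
    yl, yr = ys[j - 1], ys[j]
    w = (x - xl) / (xr - xl)
    mean = yl + w * (yr - yl)           # Brownian bridge: linear interpolation
    var = sigma**2 * (x - xl) * (xr - x) / (xr - xl)
    return mean, np.sqrt(var)

def expected_improvement(x, xs, ys, best, sigma=1.0):
    """Closed-form E[max(best - f(x), 0)] under the Gaussian posterior."""
    mean, std = wiener_posterior(x, xs, ys, sigma)
    if std == 0.0:                      # x coincides with an observation
        return 0.0
    z = (best - mean) / std
    return std * (z * norm.cdf(z) + norm.pdf(z))

def one_stage_bayes_min(f, a=0.0, b=1.0, n_iter=20, sigma=1.0, grid=512):
    """Myopic (one-stage) Bayesian minimization of f on [a, b]: each new
    observation point maximizes the expected one-step improvement."""
    xs = np.array([a, b])
    ys = np.array([f(a), f(b)])
    for _ in range(n_iter):
        best = ys.min()
        cand = np.linspace(a, b, grid)[1:-1]
        ei = np.array([expected_improvement(x, xs, ys, best, sigma) for x in cand])
        x_next = cand[np.argmax(ei)]
        xs = np.append(xs, x_next)
        ys = np.append(ys, f(x_next))
        order = np.argsort(xs)          # keep observations sorted for bracketing
        xs, ys = xs[order], ys[order]
    return xs[np.argmin(ys)], ys.min()

# Usage: a multimodal test function on [0, 1] (illustrative only)
x_best, y_best = one_stage_bayes_min(lambda x: np.sin(10 * x) + 0.5 * x, n_iter=25)
print(x_best, y_best)
```

The myopic character of the one-stage rule shows up directly in the loop: each point is chosen as if it were the final observation, so only the immediate expected gain is optimized rather than the full sequential decision problem.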