GLOBAL CONVERGENCE PROPERTIES OF CONJUGATE GRADIENT METHODS FOR OPTIMIZATION

February 1992 | JEAN CHARLES GILBERT AND JORGE NOCEDAL
This paper studies the global convergence properties of nonlinear conjugate gradient methods applied without restarts and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions: the Fletcher-Reeves method is the prototype of the first class, while the second class is characterized by a property shared with the Polak-Ribière method. Numerical experiments are presented to support the theoretical findings.

The paper begins by introducing the conjugate gradient iteration and emphasizing the role of the descent condition and of the angle between the gradient and the search direction. It then states the assumptions and line search strategies used in the analysis, namely the Wolfe conditions and an idealized line search condition. The first part establishes global convergence for methods in which the parameter $\beta_k$ is bounded in magnitude by the Fletcher-Reeves value; a modification of the Polak-Ribière formula is also proposed that ensures global convergence even with inexact line searches. The second part treats methods with nonnegative $\beta_k$ that share a property with the Polak-Ribière method, referred to as Property (*), which, together with a sufficient descent condition on the search directions, guarantees global convergence. The paper then shows that global convergence cannot be guaranteed by simply combining the intervals of admissible $\beta_k$ values arising from the two classes. Finally, numerical experiments on large-scale optimization problems confirm the effectiveness of the proposed algorithms.
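To make the iteration concrete, the following is a minimal sketch (not the authors' code) of the nonlinear conjugate gradient loop with the modified Polak-Ribière choice $\beta_k = \max\{\beta_k^{PR}, 0\}$ discussed above, using a Wolfe-type line search supplied by SciPy's generic `line_search` routine; the function names, tolerance, and line search parameters are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of nonlinear CG with the "PR+" choice beta_k = max{beta_PR, 0}
# and a Wolfe-type line search (SciPy's generic line_search routine).
import numpy as np
from scipy.optimize import line_search


def cg_pr_plus(f, grad, x0, max_iter=1000, tol=1e-6):
    """Conjugate gradient: d_k = -g_k + beta_k d_{k-1}, beta_k = max{beta_PR, 0}."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:         # stop when the gradient is small
            break
        # Step length satisfying the (strong) Wolfe conditions, 0 < c1 < c2 < 1.
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed to find a step
            break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere beta, truncated at zero (the "PR+" modification).
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

As a usage illustration, `cg_pr_plus(rosen, rosen_der, np.zeros(10))` with SciPy's built-in Rosenbrock test function and gradient converges to the minimizer at the vector of ones. The truncation of $\beta_k$ at zero is the modification that the paper shows restores global convergence with inexact line searches.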