MINIMIZATION OF FUNCTIONS HAVING LIPSCHITZ CONTINUOUS FIRST PARTIAL DERIVATIVES

November 1966 | Larry Armijo
Larry Armijo's paper treats the minimization of functions with Lipschitz continuous first partial derivatives. It proves a general convergence theorem for the gradient method under hypotheses detailed in the paper, and shows that both the usual steepest descent algorithm and a modified steepest descent algorithm converge under these hypotheses. The modified algorithm permits a variable stepsize and does not require knowledge of the Lipschitz constant \( K \).

The principal conditions on the function \( f \), Conditions I through IV, are defined; they guarantee the existence and uniqueness of the minimizer \( x^* \). Condition IV implies Conditions I and II, and when the level set \( S(x_0) \) is bounded it is equivalent to Conditions I and II. The convergence theorem states that if \( 0 < \delta \leq 1/(4K) \), then for any \( x \in S(x_0) \) the set \( S^*(x, \delta) \) is nonempty, and any sequence \( \{x_k\} \) generated by the algorithm converges to \( x^* \). Two corollaries follow: one for the steepest descent algorithm and one for the modified steepest descent algorithm, each of which converges to the minimizer \( x^* \). The paper closes with a comparison to earlier work, noting that the algorithms considered are more practical than those proposed by Curry and Goldstein, and with an acknowledgment of the referee's comments and suggestions.
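The variable-stepsize idea summarized above is today usually implemented as a backtracking line search with a sufficient-decrease test (the "Armijo rule"). The following is a minimal sketch of that scheme, not the paper's exact algorithm: the function names, the initial step `alpha0`, and the shrink factor `beta` are illustrative choices, and the sufficient-decrease coefficient `sigma` is taken to be 1/2. As in the modified algorithm, no knowledge of the Lipschitz constant \( K \) is needed.

```python
import numpy as np

def armijo_step(f, grad_f, x, alpha0=1.0, beta=0.5, sigma=0.5):
    """One gradient step with backtracking: halve the stepsize until the
    sufficient-decrease condition
        f(x) - f(x - alpha * g) >= sigma * alpha * ||g||^2
    holds for g = grad f(x)."""
    g = grad_f(x)
    alpha = alpha0
    while f(x - alpha * g) > f(x) - sigma * alpha * np.dot(g, g):
        alpha *= beta  # shrink the trial step until sufficient decrease
    return x - alpha * g

def gradient_descent(f, grad_f, x0, tol=1e-8, max_iter=1000):
    """Steepest descent driven by the backtracking stepsize rule."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        if np.linalg.norm(grad_f(x)) < tol:
            break  # gradient is (numerically) zero: stop
        x = armijo_step(f, grad_f, x)
    return x
```

For example, on \( f(x) = \|x\|^2 \) (gradient \( 2x \)), the iteration converges to the unique minimizer \( x^* = 0 \) without the user supplying a stepsize or a Lipschitz constant.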