This paper presents a study of the numerical performance of the limited memory BFGS (L-BFGS) method for large scale optimization. The L-BFGS method is compared with the method of Buckley and LeNir, which combines BFGS steps and conjugate direction steps. The results show that the L-BFGS method is faster and makes better use of additional storage to accelerate convergence, and that a simple scaling significantly improves its performance. The L-BFGS method is also compared with the partitioned quasi-Newton method of Griewank and Toint: while the partitioned quasi-Newton method is superior for some problems, the L-BFGS method is very competitive for others due to its low iteration cost. The paper also studies the convergence properties of the L-BFGS method and proves its global convergence on uniformly convex problems.
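To make the method concrete, here is a minimal sketch of the standard L-BFGS two-loop recursion for computing the search direction from the m most recent correction pairs (s_i, y_i), with s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i. The function name and variable names are illustrative, not from the paper; the initial Hessian approximation uses the simple scaling gamma = s^T y / y^T y of the kind the paper reports as beneficial.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return the search direction -H_k * grad, where H_k is the L-BFGS
    inverse Hessian approximation built from the stored pairs (oldest
    first, newest last).  H_0 is taken as gamma * I with
    gamma = s^T y / y^T y, a simple diagonal scaling."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Scaled-identity initial approximation H_0 = gamma * I.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = (s @ y) / (y @ y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: traverse pairs from oldest to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return -r
```

With no stored pairs this reduces to scaled steepest descent; each stored pair costs only O(n) work per iteration, which is the source of the low iteration cost noted above.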
The paper discusses the use of limited memory quasi-Newton methods for large scale optimization, which are extensions of the conjugate gradient method that use additional storage to accelerate convergence. These methods are suitable for large scale problems because the storage required can be controlled by the user. Limited memory methods are simple to implement as they do not require knowledge of the sparsity structure of the Hessian or the separability of the objective function.
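The user-controlled storage can be illustrated with SciPy's L-BFGS-B implementation (a later descendant of the method studied here, used only as an example): the `maxcor` option sets the number m of stored correction pairs, so memory grows as O(m n) independently of the Hessian's sparsity structure.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# 100-dimensional Rosenbrock problem as a stand-in large scale test.
x0 = np.full(100, 1.2)
res = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
               options={"maxcor": 5})  # store only m = 5 correction pairs
print(res.success, res.nit)
```

Raising `maxcor` trades memory for a richer Hessian approximation, which is exactly the storage/convergence trade-off the paper measures.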
The paper presents extensive numerical tests of the two limited memory methods and the partitioned quasi-Newton algorithm. The results indicate that the L-BFGS method is superior to the Buckley and LeNir method, and that the partitioned quasi-Newton method is extremely effective for many problems; for others, however, the L-BFGS method is very competitive in terms of CPU time. The paper also explores ways to improve the performance of the L-BFGS method by choosing suitable diagonal scalings, studies its behavior on very large problems, and compares it with two well-known conjugate gradient methods. It concludes with a convergence analysis of the L-BFGS method.