FEB 28 1996 | Richard H. Byrd, Peihuang Lu, and Jorge Nocedal
This paper presents a limited-memory quasi-Newton algorithm for solving large nonlinear optimization problems with simple bounds on the variables. At each iteration, the algorithm uses a gradient projection approach to determine the set of active constraints and a limited-memory BFGS matrix to approximate the Hessian of the objective function. Its computational cost is linear in the number of variables, making it suitable for large-scale optimization problems.

The algorithm is implemented in double-precision Fortran 77 and tested on a variety of problems, including bound-constrained quadratic problems and nonlinear problems, and it is compared with other methods, including the LANCELOT subroutine. It is efficient in most cases, particularly when the number of active bounds is small. Of the subspace minimization strategies tested, the dual method is the fastest in most cases, while the direct primal method is often as fast as the conjugate gradient method. Because the algorithm handles bounds at a low computational cost per iteration, it is well suited to problems with large, unstructured, dense Hessian matrices; it is less competitive when an exact Hessian is available or when sparsity can be exploited.

The algorithm is described in detail, including the computation of the generalized Cauchy point and the minimization of the quadratic model over the subspace of free variables.
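As a rough illustration of that projection step, the following is a minimal NumPy sketch of a generalized Cauchy point search for a quadratic model over box bounds l <= x <= u. The function name is illustrative, and the dense Hessian approximation B is a simplifying assumption: the paper's implementation instead exploits the limited-memory structure of B so that this step remains linear in the number of variables.

```python
import numpy as np

def generalized_cauchy_point(x, g, B, l, u):
    """First local minimizer of the quadratic model
    m(z) = g.T (z - x) + 0.5 (z - x).T B (z - x)
    along the projected steepest-descent path P(x - t*g, l, u).
    Dense-B sketch only; not the paper's O(n) implementation."""
    n = x.size
    # Breakpoint t_i: the step at which coordinate i hits its bound.
    t_break = np.full(n, np.inf)
    pos, neg = g > 0, g < 0
    t_break[pos] = (x[pos] - l[pos]) / g[pos]
    t_break[neg] = (x[neg] - u[neg]) / g[neg]

    z = x.copy()
    d = -g.astype(float).copy()
    d[t_break == 0] = 0.0              # coordinates already at a bound
    t_old = 0.0
    for t in np.unique(t_break[np.isfinite(t_break) & (t_break > 0)]):
        f1 = (g + B @ (z - x)) @ d     # directional derivative of m at z
        f2 = d @ B @ d                 # curvature along this segment
        if f1 >= 0:                    # model no longer decreasing: done
            return z
        if f2 > 0:
            dt = -f1 / f2
            if dt < t - t_old:         # minimizer lies inside the segment
                return z + dt * d
        z = z + (t - t_old) * d        # walk to the next breakpoint ...
        hit = t_break == t
        z[hit] = np.where(g[hit] > 0, l[hit], u[hit])  # snap to the bound
        d[hit] = 0.0                   # ... and fix the variables that hit
        t_old = t
    return z                           # path exhausted: return the corner
```

The bound components of the returned point define the working set over which the quadratic model is then minimized in the free variables.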
The paper also discusses the use of limited-memory BFGS matrices and their compact representation, which reduces computational cost. The algorithm is shown to be effective on a variety of problems, including those with tight bounds and those with a significant degree of partial separability.
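As a sketch of that compact representation, the code below builds the factors of B = theta*I - W M W^T (the compact form of Byrd, Nocedal, and Schnabel) from the m most recent correction pairs and applies B to a vector without ever forming an n x n matrix; the function names are illustrative, not the paper's.

```python
import numpy as np

def compact_lbfgs(S, Y, theta):
    """Factors of B = theta*I - W @ M @ W.T from the m most recent
    correction pairs, stored as columns of S (steps s_i) and
    Y (gradient differences y_i), oldest first."""
    SY = S.T @ Y                         # (SY)_ij = s_i . y_j
    D = np.diag(np.diag(SY))             # diagonal curvature products
    L = np.tril(SY, k=-1)                # strictly lower triangle of SY
    W = np.hstack([Y, theta * S])        # n x 2m
    M = np.linalg.inv(np.block([[-D, L.T],
                                [L, theta * (S.T @ S)]]))  # 2m x 2m
    return W, M

def apply_B(v, W, M, theta):
    """Multiply B @ v in O(n*m) time."""
    return theta * v - W @ (M @ (W.T @ v))
```

Because M is only 2m x 2m, inverting it explicitly is cheap when m is small, and it is this structure that keeps the cost of each iteration linear in the number of variables.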