This paper analyzes the computational complexity of two decision problems in smooth, nonconvex nonlinear programming (NLP): checking whether a given feasible solution is not a local minimum, and checking whether the objective function is not bounded below on the feasible set. Both problems are shown to be NP-complete, even for a restricted class of instances the paper constructs, namely indefinite quadratic programs with integer data. As a corollary, checking whether a given integer square matrix is not copositive is also NP-complete.
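To make the copositivity connection concrete: a symmetric matrix A is copositive when x^T A x >= 0 for every x >= 0, so exhibiting a nonnegative x with x^T A x < 0 certifies non-copositivity. The following is a minimal illustrative sketch, not from the paper (the function name and grid parameter are ours), that searches for such a witness by brute force over a grid on the standard simplex; its cost grows combinatorially with the dimension, consistent with the NP-completeness result.

```python
import itertools
import numpy as np

def is_not_copositive_witness(A, grid=20):
    """Search for x >= 0 with x^T A x < 0 by sampling the standard
    simplex on a regular grid. Finding such an x certifies that A is
    NOT copositive; failing to find one proves nothing. The number of
    grid points grows combinatorially with the dimension, consistent
    with the paper's result that no polynomial-time test is expected
    unless P = NP."""
    n = A.shape[0]
    # Enumerate all nonnegative integer compositions of `grid` into n parts.
    for c in itertools.product(range(grid + 1), repeat=n):
        if sum(c) != grid:
            continue
        x = np.array(c, dtype=float) / grid
        if x @ A @ x < -1e-12:
            return x  # witness of non-copositivity
    return None

# Example: [[1, -3], [-3, 1]] is not copositive, since x = (1, 1)
# gives x^T A x = 1 - 3 - 3 + 1 = -4 < 0.
A = np.array([[1.0, -3.0], [-3.0, 1.0]])
print(is_not_copositive_witness(A))  # a witness such as [0.15 0.85]
```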
The paper also illustrates the difficulty of finding a global minimum in a smooth nonconvex NLP with two examples: Fermat's Last Theorem (still an open conjecture at the time) and the subset-sum problem. In each case the question can be encoded as asking whether the global minimum value of a smooth nonconvex NLP equals zero, so computing or verifying a global minimum is at least as hard as resolving these problems.
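As an illustration of such an encoding, here is a standard smooth formulation of subset-sum in the spirit of the paper's example (the exact formulation used in the paper may differ): the global minimum value of the objective below is 0 exactly when some subset of the given integers sums to the target, so deciding whether the global minimum value is 0 is NP-hard.

```python
import numpy as np

def subset_sum_objective(d, b):
    """Smooth nonconvex objective whose global minimum value is 0
    exactly when some subset of d sums to b: the first term forces
    the weighted sum to hit b, the second forces each x_i toward
    0 or 1."""
    d = np.asarray(d, dtype=float)
    def f(x):
        return (d @ x - b) ** 2 + np.sum((x * (1.0 - x)) ** 2)
    return f

# d = [3, 5, 8], b = 11: the subset {3, 8} works, so the global
# minimum value is 0, attained at x = (1, 0, 1).
f = subset_sum_objective([3, 5, 8], 11)
print(f(np.array([1.0, 0.0, 1.0])))  # 0.0
print(f(np.array([1.0, 1.0, 0.0])))  # (8 - 11)^2 = 9.0
```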
The paper then examines whether a local minimum can be computed efficiently in a smooth nonconvex NLP. It reviews necessary and sufficient conditions for a point to be a local minimum and shows that checking whether a given point is not a local minimum is NP-complete; equivalently, verifying that a point is a local minimum is co-NP-complete. Thus verifying local minimality, like deciding whether the objective function is bounded below, is computationally hard.
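The following sketch (ours, not the paper's) shows why verification is subtle even in the unconstrained case: the standard first- and second-order tests leave a "don't know" gap when the Hessian is only positive semidefinite, and the paper's result implies that no efficient procedure can close this gap in general unless P = NP.

```python
import numpy as np

def local_min_certificate(grad, hess, x, tol=1e-8):
    """Try to certify that x is a local minimum of an unconstrained
    smooth function from its gradient and Hessian. Returns 'yes' when
    the sufficient conditions hold (zero gradient, positive definite
    Hessian), 'no' when a necessary condition fails (nonzero gradient
    or a negative Hessian eigenvalue), and 'unknown' in the degenerate
    semidefinite case, where higher-order terms decide."""
    g = grad(x)
    if np.linalg.norm(g) > tol:
        return "no"
    eigs = np.linalg.eigvalsh(hess(x))
    if eigs.min() > tol:
        return "yes"
    if eigs.min() < -tol:
        return "no"
    return "unknown"

# x^4 has a local (indeed global) minimum at 0, but the test is
# inconclusive because the Hessian vanishes there.
print(local_min_certificate(lambda x: 4 * x**3,
                            lambda x: np.atleast_2d(12 * x**2),
                            np.array([0.0])))  # 'unknown'
```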
The paper concludes by discussing suitable goals for algorithms in nonconvex NLPs. It suggests that a suitable goal is to obtain a descent sequence converging to a KKT point, which is a feasible solution satisfying the first-order necessary conditions for a local minimum. It notes that some recently developed algorithms, such as sequential quadratic programming methods, aim to achieve this goal.
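To make that goal concrete, the sketch below (ours, not the paper's; the matrix and starting point are illustrative) applies SciPy's SLSQP solver, a sequential quadratic programming method, to an indefinite quadratic over the standard simplex. In general such a method is only guaranteed to return an approximate KKT point, not a global or even a local minimum.

```python
import numpy as np
from scipy.optimize import minimize

# Indefinite quadratic minimized over the standard simplex: a smooth
# nonconvex NLP. SLSQP, an SQP method, aims for a KKT point, which
# matches the goal discussed above.
A = np.array([[1.0, -3.0], [-3.0, 1.0]])

res = minimize(
    lambda x: x @ A @ x,
    x0=np.array([0.9, 0.1]),                    # feasible start
    method="SLSQP",
    bounds=[(0.0, None), (0.0, None)],          # x >= 0
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],
)
print(res.x, res.fun)  # approx (0.5, 0.5) with objective -1.0
```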