This paper presents new gradient methods for minimizing composite functions, where the objective is the sum of a smooth function given by a black-box oracle and a simple convex function with known structure. For convex problems, the paper analyzes primal and dual variants of the gradient method with convergence rate $ O(1/k) $, and an accelerated multistep method with convergence rate $ O(1/k^2) $, where $ k $ is the iteration counter. For nonconvex problems, it proves convergence to a point from which there is no descent direction, and it shows that deciding whether a descent direction exists for general nonsmooth, nonconvex problems is NP-hard. The paper proposes efficient line-search procedures and shows that estimating unknown parameters multiplies the complexity of each iteration only by a small constant. Preliminary computational experiments confirm the superiority of the accelerated scheme. The paper also discusses the importance of exploiting problem structure in developing efficient optimization methods, which can significantly overcome the limitations of black-box complexity theory; examples include constrained minimization and barrier representations of feasible sets. This highlights the role of structural optimization in improving the efficiency of optimization methods.

Keywords: Local optimization, Convex optimization, Nonsmooth optimization, Complexity theory, Black-box model, Optimal methods, Structural optimization, $ l_1 $-Regularization.

Mathematics Subject Classification: 90C25, 90C47, 68Q25.
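To make the composite setting concrete, the sketch below applies an accelerated (multistep) composite gradient scheme of the kind the abstract describes to the standard $ l_1 $-regularized least-squares instance $ \min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1 $: the smooth part is queried as a black-box gradient, while the simple convex term $ \lambda\|x\|_1 $ is handled in closed form via its proximal operator (soft-thresholding). This is an illustrative FISTA-style implementation under assumed notation, not the paper's exact algorithm; the step-size rule, momentum sequence, and function names here are choices made for the example.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||x||_1 (the "simple" nonsmooth part).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def accelerated_composite_gradient(A, b, lam, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with an accelerated
    multistep scheme (O(1/k^2) rate in the convex case)."""
    # Lipschitz constant of the smooth gradient A^T(Ax - b);
    # the paper instead estimates such parameters by line search.
    L = np.linalg.norm(A, 2) ** 2
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)                      # black-box oracle call
        x_new = soft_threshold(y - grad / L, lam / L)  # composite gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

Dropping the extrapolation step (setting `y = x_new`) recovers the plain gradient method with its slower $ O(1/k) $ rate, which is the comparison the paper's experiments make.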