This paper presents a new approach for efficiently solving non-smooth convex optimization problems. The method replaces non-differentiable functions with smooth approximations, controlled by a smoothing parameter, which can then be minimized using gradient-based techniques. By transforming the original non-smooth problem into a smooth one, the approach reduces the iteration complexity from $ O(1/\epsilon^2) $, typical of subgradient schemes, to $ O(1/\epsilon) $, while keeping the cost of each iteration essentially unchanged.

The paper develops the theoretical foundations of the smoothing technique, presents an optimal gradient method for smooth convex optimization, and applies the combined scheme to a range of problem instances, including matrix games, continuous location problems, variational inequalities, and piecewise linear optimization. The results show that the proposed method outperforms traditional subgradient methods, particularly in iteration complexity. The paper also addresses implementation issues, including computational stability and the use of prox-functions adapted to different problem structures. Overall, the method provides a more efficient way to solve non-smooth convex optimization problems by combining smoothing techniques with optimal gradient methods.
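The core idea can be illustrated on a standard model problem: minimizing $f(x) = \max_i (a_i^\top x + b_i)$, a piecewise linear function. The sketch below (an illustrative example, not the paper's exact algorithm or prox-function setup) smooths $f$ with the entropic (log-sum-exp) approximation $f_\mu(x) = \mu \log \sum_i e^{(a_i^\top x + b_i)/\mu}$, which satisfies $f(x) \le f_\mu(x) \le f(x) + \mu \log m$ and has a $\|A\|^2/\mu$-Lipschitz gradient, and then minimizes $f_\mu$ with a textbook accelerated (Nesterov-style) gradient method:

```python
import numpy as np

def smooth_max(A, b, x, mu):
    """Entropic smoothing of f(x) = max_i (a_i.x + b_i):
    f_mu(x) = mu * log(sum_i exp((a_i.x + b_i)/mu)).
    Satisfies f(x) <= f_mu(x) <= f(x) + mu*log(m)."""
    z = (A @ x + b) / mu
    zmax = z.max()                      # shift for numerical stability
    return mu * (zmax + np.log(np.exp(z - zmax).sum()))

def smooth_max_grad(A, b, x, mu):
    """Gradient of the smoothed objective: a softmax-weighted
    combination of the rows of A."""
    z = (A @ x + b) / mu
    w = np.exp(z - z.max())
    w /= w.sum()                        # softmax weights
    return A.T @ w

def accelerated_gradient(A, b, mu, x0, L, steps):
    """Accelerated gradient descent on the smoothed objective.
    L is a Lipschitz constant of grad f_mu; ||A||_2^2 / mu works here."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_new = y - smooth_max_grad(A, b, y, mu) / L
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Hypothetical problem data for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
mu = 1e-2                               # smaller mu = tighter approximation
L = np.linalg.norm(A, 2) ** 2 / mu
x = accelerated_gradient(A, b, mu, np.zeros(5), L, 500)
```

Choosing $\mu$ on the order of the target accuracy $\epsilon$ trades the approximation error $\mu \log m$ against the growing Lipschitz constant $\|A\|^2/\mu$; since the accelerated method needs $O(\sqrt{L/\epsilon})$ iterations on a smooth problem, this balance yields the overall $O(1/\epsilon)$ complexity the paper establishes.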