September 12, 2012 | Meisam Razaviyayn, Mingyi Hong and Zhi-Quan Luo
This paper presents a unified convergence analysis of block successive minimization methods for nonsmooth optimization. The block coordinate descent (BCD) method, widely used for minimizing a continuous function of multiple block variables, requires each subproblem to be solved exactly, which can be restrictive in practice. To address this, the paper studies an inexact BCD approach in which the variable blocks are updated by successively minimizing approximations of the objective function, either locally tight upper bounds or strictly convex local approximations. The focus is on characterizing the convergence of a broad class of such methods, particularly when the objective function is non-differentiable or nonconvex.

The results unify and extend existing convergence analyses for classical algorithms such as BCD, the difference of convex functions (DC) method, the expectation maximization (EM) algorithm, and the alternating proximal minimization algorithm. The paper discusses the Block Successive Upper-bound Minimization (BSUM) algorithm, which updates a single block at each iteration by minimizing a locally tight upper bound of the objective; the Maximum Improvement Successive Upper-bound Minimization (MISUM) algorithm, which updates whichever block yields the largest improvement; and the Block Successive Convex Approximation (BSCA) algorithm, which uses strictly convex local approximations in place of upper bounds. Convergence guarantees for these algorithms are established under various conditions, including quasi-convexity and compactness assumptions.

Finally, the paper applies these methods to linear transceiver design in cellular networks, demonstrating their effectiveness on practical optimization problems.
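To make the BSUM update concrete, here is a minimal sketch, not taken from the paper itself: it applies the BSUM scheme to a LASSO-type problem, using the standard quadratic (Lipschitz) majorization of the smooth term as the locally tight upper bound, so that each block subproblem has a closed-form soft-thresholding solution. The problem setup and the names `bsum_lasso` and `soft_threshold` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def soft_threshold(v, t):
    """Closed-form minimizer of t*||x||_1 + 0.5*||x - v||^2."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bsum_lasso(A_blocks, b, lam, num_iters=200):
    """BSUM sketch for min 0.5*||sum_i A_i x_i - b||^2 + lam*sum_i ||x_i||_1.

    Each block update minimizes a locally tight quadratic upper bound of
    the smooth term (Lipschitz majorization) plus the nonsmooth l1 term,
    so every subproblem reduces to soft-thresholding.
    """
    x = [np.zeros(A.shape[1]) for A in A_blocks]
    # Block Lipschitz constants of the smooth part: L_i = ||A_i||_2^2.
    L = [np.linalg.norm(A, 2) ** 2 for A in A_blocks]
    residual = sum(A @ xi for A, xi in zip(A_blocks, x)) - b
    for _ in range(num_iters):
        for i, A in enumerate(A_blocks):     # cyclic block selection (BSUM)
            grad_i = A.T @ residual          # gradient of smooth part w.r.t. block i
            v = x[i] - grad_i / L[i]         # minimizer of the quadratic majorizer
            x_new = soft_threshold(v, lam / L[i])
            residual += A @ (x_new - x[i])   # maintain the residual incrementally
            x[i] = x_new
    return x
```

Replacing the cyclic loop with a rule that evaluates the candidate update for every block and keeps only the one giving the largest objective decrease would turn this sketch into the MISUM variant described above.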