Submitted October 27, 2008, Revised May 7, 2009 | Shiqian Ma · Donald Goldfarb · Lifeng Chen
This paper addresses the linearly constrained matrix rank minimization problem, which arises in many fields, including control, signal processing, and system identification. Its tightest convex relaxation is linearly constrained nuclear norm minimization. Although this relaxation can be solved by semidefinite programming (SDP), doing so is computationally expensive for large matrices. The authors propose fixed point and Bregman iterative algorithms for the nuclear norm minimization problem and prove convergence of the fixed point algorithm. By combining a homotopy approach with an approximate singular value decomposition (SVD) procedure, they obtain a fast, robust, and powerful algorithm, FPCA (Fixed Point Continuation with Approximate SVD). Numerical results on randomly generated and real matrix completion problems show that FPCA is significantly faster than SDPT3, a state-of-the-art SDP solver, and provides better recoverability. For example, FPCA can recover 1000 × 1000 matrices of rank 50 to a relative error of 10^-5 in about 3 minutes by sampling only 20% of the entries. The effectiveness of the algorithms is further demonstrated through applications to online recommendation, DNA microarray data sets, and image inpainting.