21 Jul 2009 | David L. Donoho, Arian Maleki, Andrea Montanari
This paper introduces AMP (Approximate Message Passing), a fast iterative algorithm for compressed sensing. Compressed sensing aims to reconstruct high-dimensional signals from a small number of measurements by exploiting their sparsity. Convex optimization approaches, such as ℓ1 minimization solved by linear programming, achieve a good sparsity-undersampling tradeoff but are computationally expensive at large problem sizes. The authors propose a modified iterative thresholding algorithm, inspired by belief propagation in graphical models, that achieves the same sparsity-undersampling tradeoff as convex optimization while running significantly faster.
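To make the structure of the iteration concrete, the following is a minimal Python/NumPy sketch of an AMP-style iteration with soft thresholding. The function names and the threshold rule (a multiple of the residual root-mean-square) are illustrative assumptions made here for readability, not the authors' exact tuning.

    import numpy as np

    def soft_threshold(v, theta):
        # Componentwise soft thresholding: eta(v; theta) = sign(v) * max(|v| - theta, 0)
        return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

    def amp(y, A, n_iter=50, alpha=2.0):
        # y: measurements (n,); A: sensing matrix (n, N) with roughly unit-norm columns.
        # alpha sets the threshold as a multiple of the estimated residual level
        # (an illustrative tuning rule, assumed here).
        n, N = A.shape
        delta = n / N                        # undersampling ratio
        x = np.zeros(N)                      # current signal estimate
        z = y.copy()                         # current residual
        for _ in range(n_iter):
            theta = alpha * np.sqrt(np.mean(z ** 2))
            pseudo = x + A.T @ z             # matched-filter "pseudo-data"
            x_new = soft_threshold(pseudo, theta)
            # Onsager correction: (1/delta) * z * average derivative of the thresholder,
            # i.e. the fraction of coordinates above threshold. This extra term is what
            # distinguishes AMP from plain iterative soft thresholding.
            onsager = (z / delta) * np.mean(np.abs(pseudo) > theta)
            z = y - A @ x_new + onsager
            x = x_new
        return x

Each iteration costs only two matrix-vector products, which is the source of the speed advantage over interior-point solvers for convex programs.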
The AMP algorithm is shown to match the theoretical performance of convex optimization methods in reconstructing sparse signals. The state evolution (SE) formalism accurately models the evolution of the mean squared error (MSE) across AMP iterations, and the sparsity-undersampling tradeoff it predicts agrees with both theoretical calculations and numerical simulations. In particular, SE shows that the MSE converges to zero throughout the region of sparsity and undersampling parameters where exact reconstruction is possible by convex optimization.
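A minimal Monte Carlo sketch of one state-evolution step is given below, assuming a noiseless measurement model and an illustrative ±1 sparse prior (both are assumptions made here only to make the expectation computable):

    import numpy as np

    def se_step(tau, delta, rho, alpha=2.0, n_mc=200_000, rng=None):
        # One step of the (noiseless) state evolution recursion:
        #   tau_{t+1}^2 = (1/delta) * E[ (eta(X0 + tau*Z; alpha*tau) - X0)^2 ],
        # with Z ~ N(0, 1) and X0 drawn from the signal prior.
        # Here X0 is nonzero with probability eps = rho * delta, taking values +/-1
        # (an illustrative prior choice).
        rng = np.random.default_rng() if rng is None else rng
        eps = rho * delta
        support = rng.random(n_mc) < eps
        x0 = np.where(support, rng.choice([-1.0, 1.0], size=n_mc), 0.0)
        pseudo = x0 + tau * rng.standard_normal(n_mc)
        theta = alpha * tau
        eta = np.sign(pseudo) * np.maximum(np.abs(pseudo) - theta, 0.0)
        return np.sqrt(np.mean((eta - x0) ** 2) / delta)

Iterating se_step from a large initial tau tracks whether the predicted MSE contracts to zero at a given (delta, rho); the largest rho for which it does defines the predicted phase-transition curve.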
The paper also demonstrates that the phase transition curves for different signal models (non-negative, signed, and bounded) coincide with those of the corresponding convex optimization problems, indicating that AMP performs as well as convex optimization in these settings. The authors further show that the phase transition curves derived from state evolution match those previously obtained from combinatorial geometry, underscoring the robustness of the results.
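For the signed case, the predicted boundary can be written as a one-dimensional maximization; the sketch below evaluates it numerically on a grid. The closed-form expression used here follows the commonly cited formulation and is stated as an assumption rather than a quotation of the paper's equations.

    import numpy as np
    from scipy.stats import norm

    def rho_se_signed(delta, z_grid=None):
        # Predicted phase-transition boundary rho_SE(delta) for the signed case,
        # assuming the expression
        #   rho_SE(delta) = max_{z>0} [1 - (2/delta) g(z)] / [1 + z^2 - 2 g(z)],
        #   g(z) = (1 + z^2) Phi(-z) - z phi(z),
        # with Phi and phi the standard normal CDF and density.
        if z_grid is None:
            z_grid = np.linspace(1e-4, 10.0, 20000)
        g = (1 + z_grid ** 2) * norm.cdf(-z_grid) - z_grid * norm.pdf(z_grid)
        num = 1.0 - (2.0 / delta) * g
        den = 1.0 + z_grid ** 2 - 2.0 * g
        return np.max(num / den)

    # Example: rho_se_signed(0.5) gives the largest sparsity fraction rho = k/n at
    # which reconstruction is predicted to succeed at undersampling delta = n/N = 0.5.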
The AMP algorithm is shown to be effective in large-scale applications, such as imaging and spectroscopy, where traditional convex optimization methods are too slow. The algorithm's performance is validated through extensive simulations and experiments, demonstrating its ability to accurately reconstruct sparse signals under various conditions. The results highlight the potential of AMP as a fast and effective alternative to convex optimization in compressed sensing.
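As a usage illustration, here is a small synthetic reconstruction experiment assuming the amp() sketch above; the problem sizes, the (delta, rho) operating point, and the Gaussian sensing matrix are arbitrary choices made for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    N, n, k = 2000, 1000, 150                      # delta = n/N = 0.5, rho = k/n = 0.15
    A = rng.standard_normal((n, N)) / np.sqrt(n)   # Gaussian matrix, ~unit-norm columns
    x_true = np.zeros(N)
    x_true[rng.choice(N, k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
    y = A @ x_true                                 # noiseless measurements

    x_hat = amp(y, A, n_iter=100)                  # amp() from the sketch above
    rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"relative reconstruction error: {rel_err:.2e}")

Because each iteration needs only the products A @ x and A.T @ z, the per-iteration cost scales to the problem sizes encountered in imaging and spectroscopy, where forming or factoring A explicitly would be prohibitive.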