Mutual Information and Minimum Mean-square Error in Gaussian Channels

23 Dec 2004 | Dongning Guo, Shlomo Shamai (Shitz), and Sergio Verdú
This paper explores the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels. It introduces a new formula that connects the derivative of mutual information with respect to the signal-to-noise ratio (SNR) to half of the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. The paper also reveals an unexpected consequence in continuous-time nonlinear estimation: the causal filtering MMSE at any SNR is equal to the average value of the noncausal smoothing MMSE, where the SNR is uniformly distributed between 0 and the given SNR. The results are derived using an incremental channel approach and geometric properties of likelihood ratios, providing insights into the connection between information theory and estimation theory.
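The central identity can be checked numerically. The sketch below (an illustration, not code from the paper) uses an equiprobable binary input X = ±1 over the scalar channel Y = √snr·X + Z with Z ~ N(0, 1), for which both the mutual information and the MMSE have closed-form expressions as Gaussian expectations. It then compares a finite-difference derivative of I(snr) against mmse(snr)/2; the function names and the trapezoidal integration are choices made here for the example.

```python
import math

def phi(z):
    # Standard normal probability density function
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def gauss_expect(f, lo=-8.0, hi=8.0, n=4000):
    # E[f(Z)] for Z ~ N(0,1), approximated by the trapezoidal rule
    h = (hi - lo) / n
    s = 0.5 * (f(lo) * phi(lo) + f(hi) * phi(hi))
    for i in range(1, n):
        z = lo + i * h
        s += f(z) * phi(z)
    return s * h

def mutual_info(snr):
    # I(snr) in nats for BPSK input; conditioning on X = 1, Y = sqrt(snr) + Z
    r = math.sqrt(snr)
    return gauss_expect(
        lambda z: math.log(2) - math.log1p(math.exp(-2 * r * (r + z)))
    )

def mmse(snr):
    # MMSE = 1 - E[tanh^2(sqrt(snr) * Y)], since E[X|Y] = tanh(sqrt(snr) * Y)
    r = math.sqrt(snr)
    return gauss_expect(lambda z: 1 - math.tanh(r * (r + z)) ** 2)

# Finite-difference derivative of I at snr = 1 versus half the MMSE
snr, d = 1.0, 1e-4
deriv = (mutual_info(snr + d) - mutual_info(snr - d)) / (2 * d)
print(deriv, mmse(snr) / 2)  # the two values should agree closely
```

At snr = 0 the code also recovers the expected endpoints I(0) = 0 and mmse(0) = 1, which is a quick sanity check on the two expectations.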