Mutual Information and Minimum Mean-square Error in Gaussian Channels

23 Dec 2004 | Dongning Guo, Shlomo Shamai (Shitz), and Sergio Verdú
This paper presents a fundamental relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels: the derivative of the mutual information with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE of estimating the input given the output, regardless of the input statistics. The relationship holds for scalar and vector signals alike, as well as for discrete-time and continuous-time noncausal MMSE estimation.

The result has an unexpected implication in continuous-time nonlinear estimation: the causal (filtering) MMSE at a given SNR is equal to the noncausal (smoothing) MMSE averaged over an SNR uniformly distributed between 0 and the given value.

The relationship is proved using an "incremental channel" approach, which analyzes the effect of an infinitesimal increase in noise (equivalently, an infinitesimal increase in SNR). The result is also shown to hold for vector channels and is connected to geometric properties of the likelihood ratio in Gaussian channels.

The paper further discusses applications in information theory, estimation theory, and signal processing, including the analysis of multiuser channels and a new derivation of de Bruijn's identity. In the continuous-time model, the derivative of the mutual information rate is likewise equal to half the noncausal MMSE. The paper concludes with the asymptotic behavior of mutual information and MMSE at low and high SNR, showing in particular that input moments beyond the second have no impact on the mutual information up to third order in the SNR.
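For reference, the central identity can be written out explicitly (a standard statement of the scalar result; the notation below is ours, not a quotation from the paper). Let Y = \sqrt{\mathrm{snr}}\,X + N with N \sim \mathcal{N}(0,1) independent of X. Then, for every input distribution with finite power,

\frac{d}{d\,\mathrm{snr}}\, I\big(X;\, \sqrt{\mathrm{snr}}\,X + N\big) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}), \qquad \mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\Big[\big(X - \mathbb{E}[X \mid \sqrt{\mathrm{snr}}\,X + N]\big)^{2}\Big].

The continuous-time consequence relating causal and noncausal estimation takes the form

\mathrm{cmmse}(\mathrm{snr}) \;=\; \frac{1}{\mathrm{snr}} \int_{0}^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, d\gamma,

that is, the filtering MMSE equals the smoothing MMSE averaged over an SNR drawn uniformly from [0, snr].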
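The connection to de Bruijn's identity can also be made concrete. In standard notation, de Bruijn's identity states that

\frac{d}{dt}\, h\big(X + \sqrt{t}\,N\big) \;=\; \frac{1}{2}\, J\big(X + \sqrt{t}\,N\big),

where h denotes differential entropy and J Fisher information. The I-MMSE relation can be viewed as its estimation-theoretic counterpart: in the scalar case the two are linked by the complementary identity J(\sqrt{\mathrm{snr}}\,X + N) = 1 - \mathrm{snr}\cdot\mathrm{mmse}(\mathrm{snr}), so that either identity can be recovered from the other.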
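As a quick numerical sanity check of the scalar identity, the following sketch (ours, not from the paper; function names are illustrative) compares a finite-difference derivative of the mutual information with half the MMSE for an equiprobable BPSK input, for which both quantities reduce to one-dimensional Gaussian expectations:

import numpy as np

# Gauss-Hermite quadrature for expectations over N ~ N(0, 1).
nodes, weights = np.polynomial.hermite.hermgauss(80)
z = np.sqrt(2.0) * nodes        # standard-normal evaluation points
w = weights / np.sqrt(np.pi)    # matching probability weights

def mutual_info_bpsk(snr):
    # I(snr) in nats for X uniform on {-1, +1} and Y = sqrt(snr) X + N.
    a = snr + np.sqrt(snr) * z
    return snr - np.sum(w * np.log(np.cosh(a)))

def mmse_bpsk(snr):
    # MMSE of X given Y; the conditional-mean estimator is tanh(sqrt(snr) y).
    a = snr + np.sqrt(snr) * z
    return 1.0 - np.sum(w * np.tanh(a) ** 2)

snr, h = 1.0, 1e-4
dI = (mutual_info_bpsk(snr + h) - mutual_info_bpsk(snr - h)) / (2 * h)
print(dI, 0.5 * mmse_bpsk(snr))  # the two values should agree closely

Both expectations are smooth integrals against a Gaussian weight, which is why Gauss-Hermite quadrature converges quickly here; the same check can be repeated at any SNR.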