November 10, 2009 | Lionel Barnett*, Adam B. Barrett† and Anil K. Seth†
Granger causality and transfer entropy are equivalent for Gaussian variables. Granger causality is a statistical measure of causal influence based on prediction via vector autoregression, while transfer entropy is an information-theoretic measure of directed information transfer between jointly dependent processes. The authors show that for Gaussian variables, these two concepts are entirely equivalent, bridging autoregressive and information-theoretic approaches to data-driven causal inference.
Granger causality is defined by the extent to which one variable assists in predicting another. Transfer entropy, on the other hand, measures the degree to which one variable disambiguates the future of another. The authors demonstrate that under Gaussian assumptions, these two measures are mathematically equivalent. This equivalence is derived from the properties of multivariate Gaussian distributions, where the conditional entropy and partial covariance matrix play key roles.
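The equivalence can be sketched in notation close to the paper's, where $X^-$ denotes the past of $X$ and $\Sigma(X \mid Z)$ is the partial covariance of $X$ given $Z$ (determinants handle the multivariate case):

```latex
% Granger causality of Y on X: log-ratio of partial covariances
F_{Y \to X} \;=\; \ln \frac{\bigl|\Sigma(X \mid X^-)\bigr|}{\bigl|\Sigma(X \mid X^-,\, Y^-)\bigr|}

% For a Gaussian variable, conditional entropy is determined by the partial
% covariance: H(X \mid Z) = \tfrac{1}{2}\ln\bigl((2\pi e)^d\,\bigl|\Sigma(X \mid Z)\bigr|\bigr),
% so the (2\pi e)^d terms cancel in the entropy difference:
T_{Y \to X} \;=\; H(X \mid X^-) - H(X \mid X^-,\, Y^-)
            \;=\; \tfrac{1}{2}\,\ln \frac{\bigl|\Sigma(X \mid X^-)\bigr|}{\bigl|\Sigma(X \mid X^-,\, Y^-)\bigr|}
            \;=\; \tfrac{1}{2}\,F_{Y \to X}.
```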
The paper also discusses the implications of this equivalence for causal inference. It shows that for Gaussian processes, transfer entropy equals exactly half the Granger causality, so the two measures are equivalent up to a factor of 2. This result has practical significance: it provides a unified framework for data-driven causal inference that bridges information-theoretic and autoregressive methods. It also underscores the role of Gaussian assumptions in neuroscience and econometrics, where they frequently serve as analytical benchmarks.
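The factor-of-2 relation can be checked numerically. The following is a minimal sketch (not from the paper): a simulated bivariate Gaussian VAR(1) in which Y drives X, with Granger causality estimated by ordinary least squares and transfer entropy computed from Gaussian conditional entropies via empirical covariances. Coefficients and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate Gaussian VAR(1) in which Y drives X but not vice versa.
# Coefficients and sample size are illustrative assumptions, not from the paper.
n, a, b, c = 20000, 0.5, 0.7, 0.4
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = b * y[t - 1] + rng.standard_normal()
    x[t] = a * x[t - 1] + c * y[t - 1] + rng.standard_normal()

Xt, Xp, Yp = x[1:], x[:-1], y[:-1]  # present of X, past of X, past of Y

def resid_var(target, preds):
    """ML residual variance of an OLS regression (with intercept)."""
    A = np.column_stack([np.ones(len(target))] + preds)
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    r = target - A @ beta
    return np.mean(r ** 2)

# Granger causality Y -> X: log-ratio of restricted vs. full residual variance.
F = np.log(resid_var(Xt, [Xp]) / resid_var(Xt, [Xp, Yp]))

def cond_var(target, conds):
    """Conditional (partial) variance of target given conds, via the
    Schur complement of the sample covariance matrix."""
    S = np.cov(np.column_stack([target] + conds).T, bias=True)
    return S[0, 0] - S[0, 1:] @ np.linalg.solve(S[1:, 1:], S[1:, 0])

# Transfer entropy Y -> X from Gaussian conditional entropies:
# H(X|Z) = 0.5*ln(2*pi*e*var(X|Z)), so the 2*pi*e terms cancel in the difference.
T = 0.5 * (np.log(cond_var(Xt, [Xp])) - np.log(cond_var(Xt, [Xp, Yp])))

print(F, 2 * T)  # the two estimates agree: F = 2T
```

The two routes are deliberately independent: the regression route never computes an entropy, and the entropy route never fits a regression, yet for Gaussian data they yield the same number up to the factor of 2.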
The authors note that while the equivalence holds under Gaussian assumptions, the relationship between Granger causality and transfer entropy may break down in non-Gaussian cases. They also discuss the challenges of estimating transfer entropy from sampled data and the need for more sophisticated techniques in empirical applications. Overall, the paper provides a foundational understanding of the relationship between Granger causality and transfer entropy for Gaussian variables, with important implications for causal inference in various fields.