Measuring Information Transfer


(February 7, 2008) | Thomas Schreiber
This paper introduces a new information-theoretic measure, *transfer entropy*, to quantify the statistical coherence between systems evolving in time. Unlike the standard time-delayed mutual information, which cannot distinguish genuine information exchange from shared information due to a common history or a common input signal, transfer entropy excludes these influences by conditioning on transition probabilities. It can therefore detect the directed exchange of information and asymmetries in the coupling of subsystems.

The author motivates the measure by discussing the limitations of mutual information, which contains neither dynamical nor directional information. Transfer entropy is instead derived from entropy rates and conditional probabilities, generalizing the entropy rate to more than one system. It is non-symmetric, reflecting the directionality of information flow, and can be computed from time series data by kernel estimation of the relevant probabilities; the defining formula and an estimation sketch are given below.

The paper demonstrates the measure on three examples: a one-dimensional lattice of unidirectionally coupled maps, the Ulam map with non-small coupling, and a bivariate physiological time series of the breath rate and heart rate of a sleeping human. In each case, transfer entropy captures the directed information flow and distinguishes it from static correlations due to common history or input signals. The author concludes that transfer entropy is a powerful tool for the multivariate analysis of time series and the study of spatially extended systems, offering a more nuanced view of information dynamics than traditional methods.
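For reference, the defining formula from the paper, in slightly adapted notation: with $x_n^{(k)} = (x_n, \ldots, x_{n-k+1})$ denoting $k$ past states of the target system $X$, and $y_n^{(l)}$ defined analogously for the source system $Y$, the transfer entropy from $Y$ to $X$ is

$$
T_{Y \to X} = \sum p\left(x_{n+1}, x_n^{(k)}, y_n^{(l)}\right) \log \frac{p\left(x_{n+1} \mid x_n^{(k)}, y_n^{(l)}\right)}{p\left(x_{n+1} \mid x_n^{(k)}\right)},
$$

i.e., a Kullback entropy between the transition probabilities of $X$ with and without knowledge of the past of $Y$. It vanishes when the past of $Y$ adds nothing to the prediction of $X$'s transitions, which is what makes the measure directional.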
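Since the paper estimates these probabilities from time series by kernel estimation, a minimal sketch in Python follows. It is not the author's code: it assumes scalar series, $k = l = 1$, and a simple box (step) kernel of radius `r`; the function name `transfer_entropy`, the radius `0.2`, and the coupled-Ulam-map demo parameters are illustrative choices only.

```python
import numpy as np

def transfer_entropy(x, y, r=0.2):
    """Plug-in estimate of T_{Y -> X} for k = l = 1, using a box
    (step) kernel of radius r to estimate the joint densities.
    Illustrative sketch only, not Schreiber's original code."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xn1, xn, yn = x[1:], x[:-1], y[:-1]   # samples of (x_{n+1}, x_n, y_n)

    # close(a)[i, j] is True when samples i and j are within r in the
    # given coordinate (box kernel; the max norm over coordinates).
    def close(a):
        return np.abs(a[:, None] - a[None, :]) <= r

    cx1, cx, cy = close(xn1), close(xn), close(yn)

    # Kernel estimates of the joint probabilities at each sample point.
    p_x1_x_y = (cx1 & cx & cy).mean(axis=1)  # p(x_{n+1}, x_n, y_n)
    p_x_y = (cx & cy).mean(axis=1)           # p(x_n, y_n)
    p_x1_x = (cx1 & cx).mean(axis=1)         # p(x_{n+1}, x_n)
    p_x = cx.mean(axis=1)                    # p(x_n)

    # Sample average of log2[ p(x_{n+1}|x_n,y_n) / p(x_{n+1}|x_n) ],
    # which approximates the probability-weighted sum in the definition.
    return np.mean(np.log2(p_x1_x_y * p_x / (p_x_y * p_x1_x)))

# Demo (hypothetical parameters): y unidirectionally drives x through
# Ulam maps f(v) = 2 - v**2, so T_{Y->X} should exceed T_{X->Y}.
f = lambda v: 2.0 - v**2
eps, n = 0.4, 2000
x, y = np.empty(n), np.empty(n)
x[0], y[0] = 0.1, 0.3
for i in range(n - 1):
    y[i + 1] = f(y[i])
    x[i + 1] = f(eps * y[i] + (1 - eps) * x[i])
print("T(Y->X) =", transfer_entropy(x, y), " T(X->Y) =", transfer_entropy(y, x))
```

Counting each point in its own neighbourhood keeps every estimated probability positive, so the logarithm is always defined; in practice the result depends on the kernel radius and on finite-sample effects, a dependence the paper itself discusses.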