Measuring Information Transfer

February 7, 2008 | Thomas Schreiber
This paper introduces a new information-theoretic measure, transfer entropy, which quantifies the directed exchange of information between systems. Unlike mutual information, which lacks directionality, transfer entropy accounts for the dynamics of information transport and distinguishes information actually exchanged from shared information due to a common history or common input signals. It is obtained by conditioning transition probabilities appropriately, which allows it to detect asymmetry in the coupling of subsystems.

The paper discusses the limitations of mutual information in capturing directional dependence and proposes transfer entropy as an alternative. Transfer entropy is defined in terms of conditional probabilities and is non-symmetric: it measures the dependence of one system on another, can quantify the exchange of information between two systems in both directions, and can be conditioned on common input signals.

The concept is illustrated with examples, including a one-dimensional lattice of unidirectionally coupled maps and a bivariate physiological time series, and is shown to detect unidirectional information flow even in complex systems. In the lattice of maps, transfer entropy correctly identifies the direction of information flow, while mutual information does not. In the physiological time series, transfer entropy indicates a stronger flow of information from the heart rate to the breath rate than vice versa.

The paper also addresses spatio-temporal systems, where transfer entropy can be used to define a velocity of information transport. The interpretation of such velocities can be challenged, since naive estimates may seem to imply superluminal communication; preliminary results suggest that appropriate conditioning resolves this paradox.

The paper concludes that transfer entropy is a powerful tool for analyzing the directed exchange of information between systems, with applications in multivariate time series analysis and the study of spatially extended systems. The method is computationally feasible and can be applied to a wide range of systems.
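To make the definition concrete: for history lengths k and l, the transfer entropy from Y to X is

    T_{Y -> X} = sum p(x_{n+1}, x_n^{(k)}, y_n^{(l)}) log [ p(x_{n+1} | x_n^{(k)}, y_n^{(l)}) / p(x_{n+1} | x_n^{(k)}) ],

so it vanishes exactly when the past of Y adds nothing to the prediction of X beyond what X's own past already provides. The Python sketch below estimates this quantity for two scalar time series with k = l = 1, using equal-width binning and plug-in histogram probabilities; the function name transfer_entropy, the binning scheme, and the default of 8 bins are illustrative choices, not taken from the paper.

    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y, bins=8):
        """Plug-in estimate of T_{Y->X} in bits, with history lengths k = l = 1."""
        # Symbolize both series by equal-width binning.
        xs = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
        ys = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])

        # Frequency counts for the joint and marginal states.
        triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))   # (x_{n+1}, x_n, y_n)
        pairs_xy = Counter(zip(xs[:-1], ys[:-1]))          # (x_n, y_n)
        pairs_xx = Counter(zip(xs[1:], xs[:-1]))           # (x_{n+1}, x_n)
        singles = Counter(xs[:-1])                         # x_n

        n = len(xs) - 1
        te = 0.0
        for (x1, x0, y0), c in triples.items():
            p_joint = c / n                                 # p(x_{n+1}, x_n, y_n)
            p_cond_full = c / pairs_xy[(x0, y0)]            # p(x_{n+1} | x_n, y_n)
            p_cond_own = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{n+1} | x_n)
            te += p_joint * np.log2(p_cond_full / p_cond_own)
        return te

In practice one would use longer histories and more careful probability estimators (kernel or correlation-sum estimates, as discussed in the paper), but the structure of the conditioning is the same.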
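As a rough check of the directional behaviour described above, one can couple two maps unidirectionally and compare the estimates in both directions. This is a minimal two-site stand-in for the coupled-map-lattice example; the Ulam-type map f(u) = 2 - u^2, the coupling strength, and the sample size below are arbitrary illustrative choices, and the snippet assumes the transfer_entropy helper sketched earlier.

    rng = np.random.default_rng(0)
    N, eps = 100_000, 0.3
    x = np.empty(N)
    y = np.empty(N)
    x[0], y[0] = rng.uniform(-1.0, 1.0, 2)
    f = lambda u: 2.0 - u ** 2                        # chaotic map on [-2, 2]
    for t in range(N - 1):
        x[t + 1] = f(x[t])                            # autonomous driver
        y[t + 1] = f(eps * x[t] + (1 - eps) * y[t])   # driven by x

    print("T(X -> Y):", transfer_entropy(y, x))   # expected clearly positive
    print("T(Y -> X):", transfer_entropy(x, y))   # expected near zero, up to finite-sample bias

Because the coupling runs only from X to Y, the estimate in the driving direction should dominate; this is the asymmetry that mutual information alone cannot reveal.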