Measuring Information Transfer
From Santa Fe Institute Events Wiki
CSSS Santa Fe 2007
Background
Dependencies between variables or processes in a system are often quantified with correlation coefficients (income <-> height, etc.). However, correlation only captures linear dependencies. In addition, the dynamical aspect (e.g. a common history) and the direction of information transfer between the processes under consideration are neglected. A recently proposed measure called Transfer Entropy was specifically designed to overcome these limitations through the clever use of conditional transition probabilities.
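Concretely, following Schreiber (2000, see Readings), the transfer entropy from a source process Y to a target process X, with target history length k and source history length l, is

T_{Y \to X} = \sum p\!\left(x_{n+1}, x_n^{(k)}, y_n^{(l)}\right) \log \frac{p\!\left(x_{n+1} \mid x_n^{(k)}, y_n^{(l)}\right)}{p\!\left(x_{n+1} \mid x_n^{(k)}\right)},

where x_n^{(k)} = (x_n, ..., x_{n-k+1}) is the recent history of X and the sum runs over all joint states. The ratio inside the logarithm compares how well the next value of X is predicted with and without knowledge of Y's history, so the measure vanishes exactly when Y adds no predictive information, and it is inherently directional (T_{Y->X} generally differs from T_{X->Y}).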
In theory, Transfer Entropy can quantify both linear and nonlinear directed dependencies in any dynamical system. In practice, the transition probabilities (or densities) have to be estimated from the data.
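To make that estimation step concrete, below is a minimal sketch of a plug-in (histogram) estimator for discrete-valued time series, written in Python. The function name, the history-length parameters and the toy example are our own illustration rather than code from the paper, and a serious analysis would additionally need bias correction and significance testing.

import numpy as np
from collections import Counter

def transfer_entropy(source, target, k=1, l=1):
    """Plug-in estimate (in bits) of the transfer entropy from `source` to
    `target` for discrete-valued time series, with target history length k
    and source history length l. Minimal sketch: assumes a small alphabet
    and enough data to populate the joint histogram reasonably well."""
    source, target = np.asarray(source), np.asarray(target)
    start, n = max(k, l), len(target)

    # Joint counts of (next target value, target history, source history).
    joint = Counter()
    for t in range(start, n):
        key = (target[t], tuple(target[t - k:t]), tuple(source[t - l:t]))
        joint[key] += 1
    total = sum(joint.values())
    p_joint = {key: c / total for key, c in joint.items()}

    # Marginals needed for the two conditional probabilities.
    p_hist_both, p_next_hist, p_hist = Counter(), Counter(), Counter()
    for (x_next, x_hist, y_hist), p in p_joint.items():
        p_hist_both[(x_hist, y_hist)] += p
        p_next_hist[(x_next, x_hist)] += p
        p_hist[x_hist] += p

    # T_{source->target} = sum p * log2[ p(x_next | x_hist, y_hist) / p(x_next | x_hist) ]
    te = 0.0
    for (x_next, x_hist, y_hist), p in p_joint.items():
        p_cond_full = p / p_hist_both[(x_hist, y_hist)]
        p_cond_self = p_next_hist[(x_next, x_hist)] / p_hist[x_hist]
        te += p * np.log2(p_cond_full / p_cond_self)
    return te

# Toy check: X copies Y with a one-step delay, so information flows Y -> X only.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=10000)
x = np.roll(y, 1)                      # x_t = y_{t-1}
print(transfer_entropy(y, x))          # close to 1 bit
print(transfer_entropy(x, y))          # close to 0 bits (finite-sample bias only)

For anything beyond such toy data, more refined estimators (e.g. kernel-based density estimation, as discussed by Schreiber) are preferable to the raw histogram above.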
Reliable estimates require data, which is why we need your data! If you happen to have good multivariate time series data (e.g. from biochemical, biomedical, or biological dynamical systems, longitudinal studies in sociology, etc.) that you expect to contain interesting dependencies of any kind, let us know.
Team members (feel free to join)
- Juergen Pahle
- Joseph Lizier
Readings
- T. Schreiber (2000) Measuring information transfer. Phys. Rev. Lett. 85(2), 461-464
Download at http://prola.aps.org/abstract/PRL/v85/i2/p461_1 (requires access to PRL).
Questions to answer
- What can we learn about (possibly nonlinear) dependencies between variables in biomedical, ecological, sociological, and other systems?
- How does information transfer, as measured by transfer entropy, break down in systems that have suffered damage (ischemia in the heart, extinction of species in dynamical food webs, etc.)?
- ...
Ideas
- Measure information transfer in the heart model by Vikas, Simon, Nathan & Co.
- ...