
Information Theory of the Heart

From Santa Fe Institute Events Wiki

'''Contact''': Hiroshi (hashika1@jhmi.edu)<br>
Thank you for your interest in the project. Feel free to join us any time and share your thoughts!<br>
<br>
'''06/12/2014(Thu) Brainstorming Session'''<br>
''Why'': To identify the adaptive mechanisms that lead to cardiac arrhythmia in diseased hearts, then to localize the origin of the arrhythmia (= wavebreak) and treat it before it occurs.<br>
''Hypothesis'': Arrhythmias result from an adaptive mechanism to optimize information transmission in abnormal hearts.<br>
''Challenge'': Wavebreaks can also occur in normal heart tissue. Would information theory metrics be sensitive enough?<br>
''How'': Time series from a cellular automaton model of 2-D heart tissue; possibly also time series of invasive electrograms from animal or human hearts.<br>
''What'': To quantify information transmission within the heart<br>
Potential Metrics<br>
Mutual Information of binary time series (0 = resting, 1 = excited) at two locations. High mutual information suggests electrical coupling.<br>
Permutation Entropy - Temporal uncertainty at one location. Applies to real-valued time series.<br>
Transfer Entropy - Spatial uncertainty between two locations. Applies to binary time series.<br>
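To make the first metric concrete, here is a minimal sketch of a plug-in mutual-information estimate for two binary time series. This is illustrative code only (the function name and example data are our own, not part of the project):

```python
# Minimal sketch: plug-in mutual information (bits) between two binary
# time series (0 = resting, 1 = excited). Illustrative, not project code.
import numpy as np

def mutual_information(x, y):
    """I(X;Y) in bits from empirical joint and marginal frequencies."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))        # joint probability
            p_a, p_b = np.mean(x == a), np.mean(y == b)  # marginals
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

x = np.array([0, 1, 0, 1, 1, 0, 1, 0])
print(mutual_information(x, x))   # → 1.0 (a balanced series shares 1 bit with itself)
```

Identical series give MI equal to the entropy of the series, the limiting case of perfect electrical coupling; independent series give MI near zero.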
<br>
'''06/13/2014(Fri) Unofficial Brainstorming Session'''<br>
Thank you all for suggesting excellent reference articles.<br>
1. Cao ''et al''., Detecting Dynamical Changes in Time Series Using the Permutation Entropy. ''Phys Rev E Stat Nonlin Soft Matter Phys'' 70: 046217, 2004 [[File: 2004_CaoY_PRE.pdf]]<br>
2. Rosso ''et al''., Distinguishing Noise from Chaos. ''Phys Rev Lett'' 99: 154102, 2007 [[File: 2007_RossoOA_PRL.pdf]]<br>
3. Lacasa ''et al''., From Time Series to Complex Networks: The Visibility Graph. ''PNAS'' 105: 4972, 2008 [[File: 2008_LacasaL_PNAS.pdf]]<br>
4. Lizier ''et al''., Local Information Transfer as a Spatiotemporal Filter for Complex Systems. ''Phys Rev E Stat Nonlin Soft Matter Phys'' 77: 026110, 2008 [[File: 2008_LizierJT_PRL.pdf]]<br>
5. Cheong ''et al''., Information Transduction Capacity of Noisy Biochemical Signaling Networks. ''Science'' 334: 354, 2011 [[File:2011_Cheong_Science.pdf]]<br>
Let's read the reference papers above this weekend and discuss what exactly we should measure at the next meeting. The next meeting will be at 02:30pm on Monday 06/16/2014 (after my tutorial[http://tuvalu.santafe.edu/events/workshops/index.php/Complex_Systems_Summer_School_2014-Tutorials#Tutorial:_Rotors.2C_Spirals_and_Scroll_Waves]) in the lecture hall. If you have any questions, I will be around this weekend. You will most certainly find me in the common room (or wherever a TV is available) at 7pm on Saturday 06/14/2014 for the FIFA World Cup match, Japan vs. Côte d'Ivoire!<br>
<br>
'''06/15/2014(Sun) Group Meeting'''<br>
Meeting with Josh @11:00am to discuss the project. Location: ARIEL<br>[[File: Ariel.jpg]]<br>
<br>
Thank you everyone for great discussions!<br>
''Summary of Information Theory Indices to Quantify Information Transfer within the Heart''<br>
''Question:'' So which one would help identify the location of wavebreaks (= intersection between wavefront and wavetail)?<br>
1. Mutual Information [[File:MI3.png]]; [[File:MI4.png]]<br>
2. Conditional Mutual Information [[File:CMI2.png]]<br>
3. Transfer Entropy (from state Y to X) [[File:TE.png]]; [[File:Un.png]]<br>
= Conditional Mutual Information regarding the next state X' [[File:TE2.png]]<br>
= Global average of a Local Transfer Entropy at each observation [[File:TE3.png]]<br>
4. Local Transfer Entropy [[File:LTE.png]]; Binary time series.<br>
See Lizier ''et al.'', 2008 [[File: 2008_LizierJT_PRL.pdf]] and his recent book chapter[http://link.springer.com/chapter/10.1007%2F978-3-642-53734-9_5]<br>
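The relation above, global transfer entropy as the average of local transfer entropies, can be sketched for binary series with history length 1 (k = l = 1). All names and the driven-pair example below are illustrative, not project code:

```python
# Sketch: transfer entropy TE(Y -> X) for binary series with history
# length 1, via the local transfer entropy at each step (Lizier et al., 2008).
import numpy as np
from collections import Counter

def local_transfer_entropy(x, y):
    """Local TE (bits) at each t: log2 p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t)."""
    x, y = np.asarray(x), np.asarray(y)
    c_xxy = Counter(zip(x[1:], x[:-1], y[:-1]))   # (next x, x, y) triples
    c_xy  = Counter(zip(x[:-1], y[:-1]))          # (x, y) pairs
    c_xx  = Counter(zip(x[1:], x[:-1]))           # (next x, x) pairs
    c_x   = Counter(x[:-1].tolist())              # x states
    lte = []
    for t in range(len(x) - 1):
        p_joint = c_xxy[(x[t + 1], x[t], y[t])] / c_xy[(x[t], y[t])]
        p_own   = c_xx[(x[t + 1], x[t])] / c_x[x[t]]
        lte.append(np.log2(p_joint / p_own))
    return np.array(lte)

# Y drives X with a one-step delay, so TE(Y -> X) should approach 1 bit.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000)
x = np.empty_like(y)
x[0], x[1:] = 0, y[:-1]          # X copies Y's previous state
lte = local_transfer_entropy(x, y)
print(lte.mean())                # global TE = average of the local values
```

The local values are what would be mapped over the tissue to filter for information-transfer structure; the global average collapses them to a single coupling number.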
5. Permutation Entropy [[File:PE.png]], where the sum runs over all ''n''! permutations π of order ''n''. Real-valued time series.<br>
See Bandt and Pompe, 2002 [[File: 2002_Bandt_PRE.pdf]] and Cao ''et al.'', 2004 [[File: 2004_CaoY_PRE.pdf]]<br>
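A minimal sketch of the order-''n'' permutation entropy, i.e. the Shannon entropy of ordinal-pattern frequencies (illustrative code, assuming NumPy):

```python
# Sketch: permutation entropy of order n (Bandt & Pompe, 2002) as the
# Shannon entropy of ordinal-pattern frequencies. Illustrative code.
import math
from collections import Counter
import numpy as np

def permutation_entropy(series, n=3):
    """Permutation entropy in bits; the maximum is log2(n!)."""
    series = np.asarray(series)
    counts = Counter(
        tuple(np.argsort(series[i:i + n]))   # ordinal pattern of each window
        for i in range(len(series) - n + 1)
    )
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h + 0.0   # + 0.0 normalizes the -0.0 of a single-pattern series

# A monotone ramp has a single ordinal pattern (zero entropy), while
# white noise approaches the maximum log2(3!) ~= 2.585 bits.
rng = np.random.default_rng(1)
print(permutation_entropy(np.arange(100)))            # → 0.0
print(permutation_entropy(rng.standard_normal(10000)))
```

The two extremes bracket what a cardiac time series would report: low values for regular rhythm, values near log2(''n''!) for noise-like dynamics.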
6. Complexity-Entropy Causality Plane. See Rosso ''et al.'', 2007 [[File: 2007_RossoOA_PRL.pdf]]<br>
7. Visibility Graph: A simple rule to map real-valued time series into a network. See Lacasa ''et al.'', 2008 [[File: 2008_LacasaL_PNAS.pdf]]<br>
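The visibility-graph rule can be sketched directly: samples (a, y_a) and (b, y_b) are linked if the straight line between them clears every sample in between. Illustrative brute-force code, fine for short series:

```python
# Sketch: natural visibility graph (Lacasa et al., 2008).
import numpy as np

def visibility_edges(series):
    """Edge list (i, j) of the natural visibility graph of a series."""
    y = np.asarray(series, dtype=float)
    edges = []
    for a in range(len(y)):
        for b in range(a + 1, len(y)):
            # visibility criterion: y_c < y_b + (y_a - y_b) * (b - c) / (b - a)
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges

print(visibility_edges([3, 1, 2]))   # → [(0, 1), (0, 2), (1, 2)]
```

In the three-point example the low middle sample does not block the line of sight between its taller neighbours, so all three pairs are connected; a tall middle sample would remove the (0, 2) edge.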
8. Graph Entropy: Quantifies structural information of a graph based on a derived probability distribution. See Dehmer, 2008 [[File:2008_Dehmer_AMC.pdf]]<br>
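One simple instance of such a graph entropy, sketched with the vertex degree as the structural functional. Dehmer's framework admits many other functionals; the degree share is our illustrative choice here, not the paper's prescription:

```python
# Sketch of a graph entropy in the spirit of Dehmer (2008): derive a
# vertex probability distribution from a structural functional (here
# the degree share) and take its Shannon entropy.
import math

def degree_entropy(edges, n_nodes):
    """Shannon entropy (bits) of the degree-based vertex distribution."""
    degree = [0] * n_nodes
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    total = sum(degree)
    return -sum((d / total) * math.log2(d / total)
                for d in degree if d > 0)

# A triangle is fully regular, so its entropy hits the log2(3) maximum;
# a 3-node path is less uniform and scores lower.
print(degree_entropy([(0, 1), (1, 2), (0, 2)], 3))   # → log2(3) ≈ 1.585
print(degree_entropy([(0, 1), (1, 2)], 3))           # → 1.5
```

Applied after the visibility-graph mapping above, this would give a single structural-information number per recording site.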

Latest revision as of 06:30, 3 July 2014