'''Contact''': Hiroshi (hashika1@jhmi.edu)<br>

Thank you for your interest in the project. Feel free to join us any time and share your thoughts!<br>

<br>
'''06/12/2014(Thu) Brainstorming Session'''<br>

''Why'': To identify adaptive mechanisms leading to cardiac arrhythmia in diseased hearts, then localize the origin of arrhythmia (= wavebreak) and treat it before it happens.<br>

''Hypothesis'': Arrhythmias result from an adaptive mechanism that optimizes information transmission in abnormal hearts.<br>

''Challenge'': Wavebreaks can also occur in normal heart tissue. Would information theory metrics be sensitive enough?<br>

''How'': Time series from a 2-D cellular automata model of heart tissue; possibly also time series of invasive electrograms from animals or humans.<br>

''What'': To quantify information transmission within the heart<br>
Potential Metrics<br>

Mutual information of binary time series (0 = resting, 1 = excited) at two locations. A high mutual information suggests electrical coupling. (A minimal sketch of this calculation follows below this list.)<br>

Permutation Entropy - Temporal uncertainty at one location. Real-valued time series.<br>

Transfer Entropy - Spatial uncertainty between two locations. Binary time series.<br>
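As a concrete starting point, here is a minimal MATLAB sketch of the mutual-information calculation for two binary channels, using a simple joint-histogram (plug-in) estimator. The function and variable names are our own illustrative choices, not code from any published toolbox.<br>

<pre>
% Mutual information I(X;Y) in bits between two binary (0/1) time series,
% estimated from the 2x2 joint histogram (plug-in estimator).
function mi = binary_mutual_information(x, y)
    n = numel(x);
    pxy = zeros(2, 2);                      % joint counts of (x, y)
    for i = 1:n
        pxy(x(i)+1, y(i)+1) = pxy(x(i)+1, y(i)+1) + 1;
    end
    pxy = pxy / n;                          % joint distribution p(x, y)
    px = sum(pxy, 2);                       % marginal p(x)
    py = sum(pxy, 1);                       % marginal p(y)
    mi = 0;
    for a = 1:2
        for b = 1:2
            if pxy(a, b) > 0
                mi = mi + pxy(a, b) * log2(pxy(a, b) / (px(a) * py(b)));
            end
        end
    end
end
</pre>

Given the excitation traces x and y recorded at two sites (0 = resting, 1 = excited), mi = binary_mutual_information(x, y) ranges from 0 bits (no statistical coupling) up to 1 bit for perfectly coupled binary channels.<br>

<br>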
'''06/13/2014(Fri) Unofficial Brainstorming Session'''<br>

Thank you all for suggesting excellent reference articles.

1. Cao ''et al''., Detecting Dynamical Changes in Time Series Using the Permutation Entropy. [[File: 2004_CaoY_PRE.pdf]] - Application of Permutation Entropy to EEG data<br>

2. Rosso ''et al''., Distinguishing Noise from Chaos. [[File: 2007_RossoOA_PRL.pdf]] - Complexity–Entropy Causality plane<br>

3. Lacasa ''et al''., From Time Series to Complex Networks: The Visibility Graph. [[File: 2008_LacasaL_PNAS.pdf]]<br>

4. Lizier ''et al''., Local Information Transfer as a Spatiotemporal Filter for Complex Systems. [[File: 2008_LizierJT_PRL.pdf]] - Local Transfer Entropy<br>

5. Cheong ''et al''., Information Transduction Capacity of Noisy Biochemical Signaling Networks. [[File:2011_Cheong_Science.pdf]] - Not directly relevant here, but important for future applications.<br>

Let's read the reference papers above this weekend and discuss exactly what we should measure at the next meeting. The next meeting will be at 02:30pm on Monday 06/16/2014 (after my tutorial[http://tuvalu.santafe.edu/events/workshops/index.php/Complex_Systems_Summer_School_2014-Tutorials#Tutorial:_Rotors.2C_Spirals_and_Scroll_Waves]) in the lecture hall. If you have any questions, I will be around this weekend. You will most certainly find me in the common room (or wherever a TV is available) at 7pm on Saturday 06/14/2014 for the FIFA World Cup match Japan vs. Côte d'Ivoire!<br>

<br>
'''06/15/2014(Sun) Group Meeting'''<br>

Meeting with Josh @11:00am to discuss the project. Location: ARIEL [[File: Ariel.jpg]]<br>

<br>

Thank you, everyone, for the great discussions!<br>

<br>
''Summary of Information Theory Indices to Quantify Information Transfer within the Heart''<br>

''Question:'' Which of these would help identify the location of wavebreaks (= the intersection between wavefront and wavetail)?<br>
1. Mutual Information [[File:MI3.png]]; [[File:MI4.png]]<br>

2. Conditional Mutual Information [[File:CMI2.png]]<br>
3. Transfer Entropy (from state Y to X) [[File:TE.png]]; [[File:Un.png]]<br>

= Conditional Mutual Information regarding the next state X' [[File:TE2.png]]<br>

= Global average of a Local Transfer Entropy at each observation [[File:TE3.png]]<br>
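A minimal MATLAB sketch of TE(Y → X) for binary time series, assuming a history length of 1, so that it is exactly the conditional mutual information between the next state X' and the past of Y given the past of X. Function and variable names are illustrative, not taken from either toolbox listed under the milestones.<br>

<pre>
% Transfer entropy TE(Y -> X) in bits for binary time series, history length 1.
% Counts joint occurrences of (x_next, x_past, y_past) and applies
% TE = sum p(x',x,y) * log2( p(x'|x,y) / p(x'|x) ).
function te = binary_transfer_entropy(x, y)
    xn = x(2:end);  xp = x(1:end-1);  yp = y(1:end-1);   % next / past states
    n  = numel(xn);
    c  = zeros(2, 2, 2);                                 % counts over (x', x, y)
    for i = 1:n
        c(xn(i)+1, xp(i)+1, yp(i)+1) = c(xn(i)+1, xp(i)+1, yp(i)+1) + 1;
    end
    p   = c / n;                     % p(x', x, y)
    pxy = sum(p, 1);                 % p(x, y),   size 1x2x2
    pxx = sum(p, 3);                 % p(x', x),  size 2x2
    px  = sum(pxx, 1);               % p(x),      size 1x2
    te  = 0;
    for a = 1:2        % x'
      for b = 1:2      % x
        for cc = 1:2   % y
          if p(a, b, cc) > 0
              pcond_xy = p(a, b, cc) / pxy(1, b, cc);    % p(x'|x,y)
              pcond_x  = pxx(a, b)   / px(b);            % p(x'|x)
              te = te + p(a, b, cc) * log2(pcond_xy / pcond_x);
          end
        end
      end
    end
end
</pre>

Note that binary_transfer_entropy(x, y) and binary_transfer_entropy(y, x) generally differ, which is what makes the measure directional.<br>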
4. Local Transfer Entropy [[File:LTE.png]]; binary time series.<br>

See Lizier [[File: 2008_LizierJT_PRL.pdf]] and his recent book chapter[http://link.springer.com/chapter/10.1007%2F978-3-642-53734-9_5]<br>
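The local variant keeps the pointwise log-ratio at every observation instead of averaging it. A sketch built on the same counting scheme as above (again history length 1, our own naming):<br>

<pre>
% Local transfer entropy te_loc(t) = log2( p(x'|x,y) / p(x'|x) ) at each
% observation; its average over t recovers the global TE(Y -> X).
function te_loc = binary_local_transfer_entropy(x, y)
    xn = x(2:end);  xp = x(1:end-1);  yp = y(1:end-1);
    n  = numel(xn);
    c  = zeros(2, 2, 2);                         % counts over (x', x, y)
    for i = 1:n
        c(xn(i)+1, xp(i)+1, yp(i)+1) = c(xn(i)+1, xp(i)+1, yp(i)+1) + 1;
    end
    p   = c / n;                                 % p(x', x, y)
    pxy = sum(p, 1);                             % p(x, y)
    pxx = sum(p, 3);                             % p(x', x)
    px  = sum(pxx, 1);                           % p(x)
    te_loc = zeros(n, 1);
    for i = 1:n
        a = xn(i)+1; b = xp(i)+1; cc = yp(i)+1;
        pcond_xy  = p(a, b, cc) / pxy(1, b, cc); % p(x'|x,y)
        pcond_x   = pxx(a, b)   / px(b);         % p(x'|x)
        te_loc(i) = log2(pcond_xy / pcond_x);    % can be negative at single points
    end
end
</pre>

mean(te_loc) reproduces the global TE computed above, and plotting te_loc over time (or space) is what makes it usable as a spatiotemporal filter in Lizier's sense.<br>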
5. Permutation Entropy [[File:PE.png]], where the sum runs over all n! permutations π of order n. Real-valued time series.<br>

See Bandt and Pompe [[File: 2002_Bandt_PRE.pdf]] and Cao [[File: 2004_CaoY_PRE.pdf]]<br>
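A minimal MATLAB sketch of permutation entropy of order m (the n in the formula above); names are illustrative, and no tie handling or embedding delay is included.<br>

<pre>
% Permutation entropy of order m for a real-valued time series v.
% Each window of m consecutive samples is mapped to its ordinal pattern
% (the permutation that sorts it); H is the Shannon entropy of the
% pattern distribution, optionally normalized by log2(m!).
function [H, Hnorm] = permutation_entropy(v, m)
    N = numel(v) - m + 1;                         % number of windows
    patterns = zeros(N, m);
    for t = 1:N
        [~, ord] = sort(v(t:t+m-1));              % ordinal pattern (rank order)
        patterns(t, :) = ord(:).';
    end
    [~, ~, idx] = unique(patterns, 'rows');       % label each distinct pattern
    p = accumarray(idx, 1) / N;                   % pattern probabilities
    H = -sum(p .* log2(p));
    Hnorm = H / log2(factorial(m));               % 0 <= Hnorm <= 1
end
</pre>

Bandt and Pompe recommend orders of roughly 3-7 with a series much longer than m!, so that every pattern has a chance to appear.<br>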
6. Complexity-Entropy Causality Plane. See Rosso [[File: 2007_RossoOA_PRL.pdf]]<br>
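A sketch of one possible implementation of the plane's two coordinates, assuming the definitions in Rosso ''et al''.: the x-coordinate is the normalized permutation entropy and the y-coordinate is the Jensen-Shannon statistical complexity. The normalization constant for the divergence is computed numerically from a delta distribution rather than from the closed-form expression.<br>

<pre>
% Complexity-entropy causality plane for a time series v and order m:
% Hn = normalized permutation entropy, C = Q * Hn, where Q is the
% Jensen-Shannon divergence between the ordinal-pattern distribution and
% the uniform distribution, normalized by its maximum possible value.
function [Hn, C] = complexity_entropy_plane(v, m)
    Nstates = factorial(m);
    N = numel(v) - m + 1;                         % number of windows
    patterns = zeros(N, m);
    for t = 1:N
        [~, ord] = sort(v(t:t+m-1));
        patterns(t, :) = ord(:).';
    end
    [~, ~, idx] = unique(patterns, 'rows');
    P = zeros(Nstates, 1);                        % full pattern distribution
    counts = accumarray(idx, 1);
    P(1:numel(counts)) = counts / N;              % non-occurring patterns stay 0
    Pe = ones(Nstates, 1) / Nstates;              % uniform reference distribution

    S  = @(q) -sum(q(q > 0) .* log2(q(q > 0)));   % Shannon entropy (bits)
    Hn = S(P) / log2(Nstates);                    % normalized permutation entropy

    J  = S((P + Pe)/2) - S(P)/2 - S(Pe)/2;        % Jensen-Shannon divergence
    Pdelta = [1; zeros(Nstates-1, 1)];            % distribution maximizing J
    Jmax = S((Pdelta + Pe)/2) - S(Pdelta)/2 - S(Pe)/2;
    C  = (J / Jmax) * Hn;                         % statistical complexity
end
</pre>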
7. Visibility Graph: A simple rule to map a real-valued time series into a network. See Lacasa [[File: 2008_LacasaL_PNAS.pdf]]<br>
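A direct MATLAB sketch of the natural visibility criterion from Lacasa ''et al''. (O(n^3), so fine for short series); the function name is ours.<br>

<pre>
% Natural visibility graph: nodes are time points, and two samples
% (a, y(a)) and (b, y(b)) are linked if every intermediate sample lies
% strictly below the straight line joining them.
function A = visibility_graph(y)
    n = numel(y);
    A = zeros(n);                       % adjacency matrix
    for a = 1:n-1
        for b = a+1:n
            visible = true;
            for c = a+1:b-1
                if y(c) >= y(b) + (y(a) - y(b)) * (b - c) / (b - a)
                    visible = false;
                    break
                end
            end
            if visible
                A(a, b) = 1;
                A(b, a) = 1;
            end
        end
    end
end
</pre>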
8. Graph Entropy: Quantifies the structural information of a graph based on a derived probability distribution. See Dehmer [[File:2008_Dehmer_AMC.pdf]]<br>

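Dehmer's framework assigns vertex probabilities through an information functional and takes the Shannon entropy of that distribution. As the simplest concrete stand-in (not Dehmer's specific functional), here is a degree-based version that could be applied, for example, to the visibility-graph adjacency matrix sketched above.<br>

<pre>
% A simple degree-based graph entropy: assign each vertex the probability
% p_i = deg(i) / sum(deg) and take the Shannon entropy of that distribution.
% (Dehmer's framework allows more general information functionals; the
% degree functional is just the simplest concrete choice.)
function H = degree_graph_entropy(A)
    deg = sum(A, 2);                    % vertex degrees from adjacency matrix
    p = deg / sum(deg);
    p = p(p > 0);                       % ignore isolated vertices
    H = -sum(p .* log2(p));
end
</pre>
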
''Project Milestones''<br>

I have arbitrarily created three milestones to split the group into three subgroups, so that everyone can contribute to one aspect of the project. At this stage, please treat these milestones as a rough plan; I am sure there will be more to do!<br>

1. Build a 2-D cellular automata (CA) model of realistic heart tissue (Matlab). A toy CA sketch follows the resource list below.<br>

''Available Resources:''<br>

- Published model [[File:2005_Atizenza.pdf]]<br>

- Netlogo CA code (in Dropbox); the original code is here[http://ccl.northwestern.edu/netlogo/models/community/SimHeart]<br>

- Matlab Central: Spiral waves (non-CA)[http://www.mathworks.com/matlabcentral/fileexchange/22492-spiral-waves-in-monodomain-reaction-diffusion-model]<br>

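As a placeholder until the published model is implemented, a Greenberg-Hastings-style excitable-media CA captures the basic resting/excited/refractory cycle. Everything below (grid size, refractory period, the plane-wave initial condition) is an arbitrary toy choice, not the published model listed above.<br>

<pre>
% Minimal Greenberg-Hastings-style CA for 2-D excitable tissue:
% 0 = resting, 1 = excited, 2..(1+R) = refractory. A resting cell fires
% when at least one of its 8 neighbors is excited; an excited cell then
% stays refractory for R steps before returning to rest.
nrows = 100; ncols = 100; R = 5; nsteps = 200;
grid = zeros(nrows, ncols);
grid(:, 1) = 1;                                   % plane wave initiated at the left edge
history = zeros(nrows, ncols, nsteps);            % saved for later time-series extraction

for t = 1:nsteps
    excited = (grid == 1);
    % number of excited 8-neighbors, via a 3x3 convolution (zero-padded edges)
    nexc = conv2(double(excited), [1 1 1; 1 0 1; 1 1 1], 'same');
    newgrid = grid;
    newgrid(grid == 0 & nexc > 0) = 1;            % resting -> excited
    newgrid(grid == 1) = 2;                       % excited -> start of refractory period
    newgrid(grid > 1 & grid < 1 + R) = grid(grid > 1 & grid < 1 + R) + 1;
    newgrid(grid == 1 + R) = 0;                   % end of refractory -> resting
    grid = newgrid;
    history(:, :, t) = grid;
end
</pre>

Replacing the plane-wave initial condition with a broken wavefront (e.g., exciting only part of a column while nearby tissue is still refractory) is one way to induce spiral waves and, eventually, wavebreaks.<br>
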
2. Extract time series from the CA model and modify the existing code to calculate information theory indices specific to the CA model. A sketch of the extraction step follows the resource list below.<br>

''Available Resources:''<br>

- Transfer Entropy Toolbox: Matlab code [https://code.google.com/p/transfer-entropy-toolbox/]; ref [[File:2011_ItoS_PLOS.pdf]]<br>

- Information Dynamics Toolkit: Java code (usable in Matlab) for local transfer entropy, etc. [https://code.google.com/p/information-dynamics-toolkit/]<br>

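A minimal sketch of the extraction step, using the history array saved by the toy CA above and the hand-rolled functions from the index summary (assumed to be saved as .m files on the MATLAB path). Either toolbox above could replace those functions; the site coordinates are arbitrary examples.<br>

<pre>
% Binarize the saved CA history (1 = excited, 0 = otherwise) at two
% neighboring sites and feed the traces to the functions sketched above.
siteA = double(squeeze(history(50, 50, :)) == 1);  % excitation trace at site A
siteB = double(squeeze(history(50, 51, :)) == 1);  % excitation trace at site B

mi      = binary_mutual_information(siteA, siteB);
te_AtoB = binary_transfer_entropy(siteB, siteA);   % TE(Y -> X) with Y = siteA, X = siteB
te_BtoA = binary_transfer_entropy(siteA, siteB);   % information flow in the opposite direction
</pre>
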
3. Statistical analysis, visualization and interpretation of the results. A sketch of one possible visualization follows below.<br>
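One speculative way to visualize the results, assuming the toy CA and the binary_transfer_entropy sketch above: map the transfer entropy from each site's left-hand neighbor into that site and display it as an image. Whether such a map actually highlights wavebreak locations is exactly the hypothesis to be tested, not an established result.<br>

<pre>
% Map TE from each site's left neighbor into that site and display it;
% abrupt changes in the map are candidate wavebreak locations (untested).
nr = size(history, 1); nc = size(history, 2);
te_map = zeros(nr, nc);
for r = 1:nr
    for c = 2:nc
        x = double(squeeze(history(r, c,   :)) == 1);   % target site
        y = double(squeeze(history(r, c-1, :)) == 1);   % its left neighbor (source)
        te_map(r, c) = binary_transfer_entropy(x, y);
    end
end
imagesc(te_map); axis image; colorbar;
title('TE from left neighbor into each site (bits)');
</pre>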