https://wiki.santafe.edu/api.php?action=feedcontributions&user=Chaos&feedformat=atomSanta Fe Institute Events Wiki - User contributions [en]2023-02-06T15:34:02ZUser contributionsMediaWiki 1.37.1https://wiki.santafe.edu/index.php?title=Complex_Systems_Summer_School_2011-Software&diff=40721Complex Systems Summer School 2011-Software2011-06-22T21:39:19Z<p>Chaos: /* Nonlinearity */</p>
<hr />
<div>{{Complex Systems Summer School 2011}}<br />
<br />
Please download these programs for use during the Summer School.<br />
<br />
==General Software==<br />
<br />
NetLogo: [http://ccl.northwestern.edu/netlogo/ Download Link]<br />
<br />
[http://backspaces.net/wiki/NetLogo_Tutorial Basic NetLogo tutorial]<br />
<br />
R Statistical Programming Language: [http://cran.opensourceresources.org/ Download Link]<br />
<br />
==Nonlinearity==<br />
<br />
TISEAN 3.0.1 Nonlinear Time Series Analysis Software [http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html Download Link]<br />
<br />
<br />
[[User:X14n|X14n]] 01:10, 3 June 2011 (UTC) says:<br />
<br />
You might be interested in the following TISEAN-related R packages (the links on the TISEAN page are broken):<br />
* [http://cran.r-project.org/web/packages/tsDyn/index.html tsDyn]<br />
* [http://cran.r-project.org/web/packages/RTisean/index.html RTisean]<br />
<br />
The labs for the Symbolic Dynamics lectures are available via your web browser; see the links on the Nonlinearity Module page.<br />
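TISEAN's command-line tools read a plain-text file with one value per line, so a quick way to try them out is to generate test data yourself. A minimal Python sketch (the logistic-map parameters and the output filename `logistic.dat` are illustrative assumptions, not part of TISEAN):

```python
# Generate a chaotic test series (logistic map, r = 4) for TISEAN's
# command-line tools, which read one value per line from a text file.
# Parameter choices and the filename below are illustrative.

def logistic_series(n, r=4.0, x0=0.4, transient=1000):
    """Return n iterates of the logistic map after discarding a transient."""
    x = x0
    for _ in range(transient):      # let the orbit settle onto the attractor
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

def save_series(path, values):
    """Write one value per line, the format TISEAN utilities expect."""
    with open(path, "w") as f:
        f.write("\n".join("%.10f" % v for v in values) + "\n")

save_series("logistic.dat", logistic_series(5000))
```

You can then point TISEAN utilities such as `mutual` at the file to estimate the time-delayed mutual information used to choose an embedding delay.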
<br />
==Networks==<br />
<br />
[http://gephi.org/ Gephi Network Analysis Software]<br />
<br />
[http://www.umich.edu/~mejn/netdata/ Network Datasets]<br />
<br />
[http://www.cytoscape.org/ Cytoscape network analysis software]<br /><br />
Cytoscape is another package people might want to play around with.</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40720Module:Complexity2011-06-22T21:35:50Z<p>Chaos: /* Lecture Notes */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, <i>Elements of Information Theory</i>,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, <i>An Introduction to Kolmogorov Complexity and Its Applications</i>,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
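The third background reference studies how the block entropy H(L) of a process converges with block length L. A minimal Python sketch of that calculation (not the authors' code; the period-2 example sequence is illustrative): it estimates h(L) = H(L) - H(L-1), which converges to the entropy rate.

```python
# Sketch of the entropy-convergence idea from Crutchfield & Feldman:
# estimate block entropies H(L) of a symbol sequence and watch the
# increments h(L) = H(L) - H(L-1) converge to the entropy rate.
from collections import Counter
from math import log2

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * log2(c / n) for c in counts.values())

def entropy_rate_estimates(seq, max_L):
    """h(L) = H(L) - H(L-1) for L = 1..max_L, with H(0) = 0."""
    H = [0.0] + [block_entropy(seq, L) for L in range(1, max_L + 1)]
    return [H[L] - H[L - 1] for L in range(1, max_L + 1)]

# Period-2 process: H(L) saturates quickly, so h(L) drops toward 0.
print(entropy_rate_estimates([0, 1] * 500, 4))
```

For a fair-coin sequence, by contrast, h(L) stays near 1 bit for all L.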
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models: [[Media:ComplexityLecture1A.pdf |Complexity Lecture 1A (PDF)]]<br />
<li>Information Theory: [[Media:ComplexityLecture1B.pdf |Complexity Lecture 1B (PDF)]]<br />
<li>Information in Processes: [[Media:ComplexityLecture1C.pdf |Complexity Lecture 1C (PDF)]]<br />
<li>Memory in Processes: [[Media:ComplexityLecture1D.pdf |Complexity Lecture 1D (PDF)]]<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel: [[Media:ComplexityLecture2A.pdf |Complexity Lecture 2A (PDF)]]<br />
<li>Causal Models: [[Media:ComplexityLecture2B.pdf |Complexity Lecture 2B (PDF)]]<br />
<li>Measures of Complexity: [[Media:ComplexityLecture2C.pdf |Complexity Lecture 2C (PDF)]]<br />
<li>Applications: [[Media:ComplexityLecture2D.pdf |Complexity Lecture 2D (PDF)]]<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1010.5545 Many Roads to Synchrony: Natural Time Scales and Their Algorithms]'''<br />
<br />
'''[http://arxiv.org/abs/1007.5354 Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation]'''<br />
<br />
'''[http://arxiv.org/abs/0905.4787 Information Accessibility and Cryptic Processes]'''<br />
<br />
'''[http://arxiv.org/abs/0905.3587 Prediction, Retrodiction, and The Amount of Information Stored in the Present]'''<br />
<br />
'''[http://arxiv.org/abs/0806.4789 The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Online Labs==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40719Module:Complexity2011-06-22T21:33:11Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, <i>Elements of Information Theory</i>,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, <i>An Introduction to Kolmogorov Complexity and Its Applications</i>,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models: [[Media:ComplexityLecture1A.pdf |Complexity Lecture 1A (PDF)]]<br />
<li>Information Theory: [[Media:ComplexityLecture1B.pdf |Complexity Lecture 1B (PDF)]]<br />
<li>Information in Processes: [[Media:ComplexityLecture1C.pdf |Complexity Lecture 1C (PDF)]]<br />
<li>Memory in Processes: [[Media:ComplexityLecture1D.pdf |Complexity Lecture 1D (PDF)]]<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel: [[Media:ComplexityLecture2A.pdf |Complexity Lecture 2A (PDF)]]<br />
<li>Causal Models: [[Media:ComplexityLecture2B.pdf |Complexity Lecture 2B (PDF)]]<br />
<li>Measures of Complexity: [[Media:ComplexityLecture2C.pdf |Complexity Lecture 2C (PDF)]]<br />
<li>Applications: [[Media:ComplexityLecture2D.pdf |Complexity Lecture 2D (PDF)]]<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1010.5545 Many Roads to Synchrony: Natural Time Scales and Their Algorithms]'''<br />
<br />
'''[http://arxiv.org/abs/1007.5354 Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation]'''<br />
<br />
'''[http://arxiv.org/abs/0905.4787 Information Accessibility and Cryptic Processes]'''<br />
<br />
'''[http://arxiv.org/abs/0905.3587 Prediction, Retrodiction, and The Amount of Information Stored in the Present]'''<br />
<br />
'''[http://arxiv.org/abs/0806.4789 The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Online Labs==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Complex_Systems_Summer_School_2011_(CSSS)&diff=40718Complex Systems Summer School 2011 (CSSS)2011-06-22T21:32:13Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011}}<br />
<br />
Welcome to the 2011 Complex Systems Summer School! <br />
<br />
Please stay tuned as we add more content to these pages.<br />
<br />
We're looking forward to seeing everyone on June 8th.<br />
<br />
David Krakauer, <br />
<br />
Program Director, 2011 SFI Complex Systems Summer School<br />
<br />
'''Please sign up for [[Alfred Hubler's Nonlinear Dynamics Lab 2011 | Alfred Hubler's Nonlinear Dynamics Lab]]'''<br />
<br />
==WHAT'S NEW?!?!==<br />
<br />
15:00, 22 June 2011 (UTC) Complexity Module slides, online labs, and background reading updated. Start playing with the labs!<br />
<br />
17:46, 16 June 2011 (UTC) Cris Moore slides up.<br />
<br />
15:02, 14 June 2011 (UTC) GML network of research interests posted under "Projects & Working Groups".<br />
<br />
22:14, 13 June 2011 (UTC) Machine Learning readings up.<br />
<br />
17:20, 13 June 2011 (UTC) Robustness readings up on the Robustness module page.<br />
<br />
15:37, 13 June 2011 (UTC) Intro NetLogo tutorial linked from the software page.<br />
<br />
15:15, 13 June 2011 (UTC) New software for the Networks module on the software page.<br />
<br />
<br />
The Participants pages are up! If you haven't done so yet, please login to edit your wiki stub.<br />
At the very least, answer the questions below and post a photo.<br />
<br />
Start thinking about what (if any) tutorials or lectures you might want to give to your CSSS-mates, or what kinds of working groups you'd like to see organized.<br />
<br />
Questions to get you started:<br />
* What are your main interests? Feel free to include a "pie in the sky" big idea! <br />
* What sort of expertise can you bring to the group?<br />
* What do you hope to get out of the CSSS?<br />
* Do you have any possible projects in mind for the CSSS?<br />
<br />
Also, Please visit [http://www.facebook.com/pages/SFI-Complex-Systems-Summer-School/195552467134324?sk=wall&filter=2 the Complex Systems Summer School Facebook Page]</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40693Module:Complexity2011-06-22T17:02:44Z<p>Chaos: /* Links */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models: [[Media:ComplexityLecture1A.pdf |Complexity Lecture 1A (PDF)]]<br />
<li>Information Theory: [[Media:ComplexityLecture1B.pdf |Complexity Lecture 1B (PDF)]]<br />
<li>Information in Processes: [[Media:ComplexityLecture1C.pdf |Complexity Lecture 1C (PDF)]]<br />
<li>Memory in Processes: [[Media:ComplexityLecture1D.pdf |Complexity Lecture 1D (PDF)]]<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel: [[Media:ComplexityLecture2A.pdf |Complexity Lecture 2A (PDF)]]<br />
<li>Causal Models: [[Media:ComplexityLecture2B.pdf |Complexity Lecture 2B (PDF)]]<br />
<li>Measures of Complexity: [[Media:ComplexityLecture2C.pdf |Complexity Lecture 2C (PDF)]]<br />
<li>Applications: [[Media:ComplexityLecture2D.pdf |Complexity Lecture 2D (PDF)]]<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1010.5545 Many Roads to Synchrony: Natural Time Scales and Their Algorithms]'''<br />
<br />
'''[http://arxiv.org/abs/1007.5354 Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation]'''<br />
<br />
'''[http://arxiv.org/abs/0905.4787 Information Accessibility and Cryptic Processes]'''<br />
<br />
'''[http://arxiv.org/abs/0905.3587 Prediction, Retrodiction, and The Amount of Information Stored in the Present]'''<br />
<br />
'''[http://arxiv.org/abs/0806.4789 The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Online Labs==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40692Module:Complexity2011-06-22T17:02:08Z<p>Chaos: /* Readings */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models: [[Media:ComplexityLecture1A.pdf |Complexity Lecture 1A (PDF)]]<br />
<li>Information Theory: [[Media:ComplexityLecture1B.pdf |Complexity Lecture 1B (PDF)]]<br />
<li>Information in Processes: [[Media:ComplexityLecture1C.pdf |Complexity Lecture 1C (PDF)]]<br />
<li>Memory in Processes: [[Media:ComplexityLecture1D.pdf |Complexity Lecture 1D (PDF)]]<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel: [[Media:ComplexityLecture2A.pdf |Complexity Lecture 2A (PDF)]]<br />
<li>Causal Models: [[Media:ComplexityLecture2B.pdf |Complexity Lecture 2B (PDF)]]<br />
<li>Measures of Complexity: [[Media:ComplexityLecture2C.pdf |Complexity Lecture 2C (PDF)]]<br />
<li>Applications: [[Media:ComplexityLecture2D.pdf |Complexity Lecture 2D (PDF)]]<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1010.5545 Many Roads to Synchrony: Natural Time Scales and Their Algorithms]'''<br />
<br />
'''[http://arxiv.org/abs/1007.5354 Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation]'''<br />
<br />
'''[http://arxiv.org/abs/0905.4787 Information Accessibility and Cryptic Processes]'''<br />
<br />
'''[http://arxiv.org/abs/0905.3587 Prediction, Retrodiction, and The Amount of Information Stored in the Present]'''<br />
<br />
'''[http://arxiv.org/abs/0806.4789 The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40691Module:Complexity2011-06-22T17:01:45Z<p>Chaos: /* Readings */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models: [[Media:ComplexityLecture1A.pdf |Complexity Lecture 1A (PDF)]]<br />
<li>Information Theory: [[Media:ComplexityLecture1B.pdf |Complexity Lecture 1B (PDF)]]<br />
<li>Information in Processes: [[Media:ComplexityLecture1C.pdf |Complexity Lecture 1C (PDF)]]<br />
<li>Memory in Processes: [[Media:ComplexityLecture1D.pdf |Complexity Lecture 1D (PDF)]]<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel: [[Media:ComplexityLecture2A.pdf |Complexity Lecture 2A (PDF)]]<br />
<li>Causal Models: [[Media:ComplexityLecture2B.pdf |Complexity Lecture 2B (PDF)]]<br />
<li>Measures of Complexity: [[Media:ComplexityLecture2C.pdf |Complexity Lecture 2C (PDF)]]<br />
<li>Applications: [[Media:ComplexityLecture2D.pdf |Complexity Lecture 2D (PDF)]]<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1010.5545 Many Roads to Synchrony: Natural Time Scales and Their Algorithms]'''<br />
<br />
'''[http://arxiv.org/abs/1007.5354 Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation]'''<br />
<br />
'''[http://arxiv.org/abs/0905.4787 Information Accessibility and Cryptic Processes]'''<br />
<br />
'''[http://arxiv.org/abs/0905.3587 Prediction, Retrodiction, and The Amount of Information Stored in the Present]'''<br />
<br />
'''[http://arxiv.org/abs/0806.4789 The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture2D.pdf&diff=40687File:ComplexityLecture2D.pdf2011-06-22T16:57:13Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture2C.pdf&diff=40685File:ComplexityLecture2C.pdf2011-06-22T16:56:57Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture2B.pdf&diff=40684File:ComplexityLecture2B.pdf2011-06-22T16:56:41Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture2A.pdf&diff=40683File:ComplexityLecture2A.pdf2011-06-22T16:56:21Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture1D.pdf&diff=40682File:ComplexityLecture1D.pdf2011-06-22T16:56:01Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture1C.pdf&diff=40681File:ComplexityLecture1C.pdf2011-06-22T16:55:45Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture1B.pdf&diff=40680File:ComplexityLecture1B.pdf2011-06-22T16:55:28Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:ComplexityLecture1A.pdf&diff=40679File:ComplexityLecture1A.pdf2011-06-22T16:55:09Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40678Module:Complexity2011-06-22T16:54:28Z<p>Chaos: /* Lecture Notes */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models: [[Media:ComplexityLecture1A.pdf |Complexity Lecture 1A (PDF)]]<br />
<li>Information Theory: [[Media:ComplexityLecture1B.pdf |Complexity Lecture 1B (PDF)]]<br />
<li>Information in Processes: [[Media:ComplexityLecture1C.pdf |Complexity Lecture 1C (PDF)]]<br />
<li>Memory in Processes: [[Media:ComplexityLecture1D.pdf |Complexity Lecture 1D (PDF)]]<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel: [[Media:ComplexityLecture2A.pdf |Complexity Lecture 2A (PDF)]]<br />
<li>Causal Models: [[Media:ComplexityLecture2B.pdf |Complexity Lecture 2B (PDF)]]<br />
<li>Measures of Complexity: [[Media:ComplexityLecture2C.pdf |Complexity Lecture 2C (PDF)]]<br />
<li>Applications: [[Media:ComplexityLecture2D.pdf |Complexity Lecture 2D (PDF)]]<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40677Module:Complexity2011-06-22T16:51:50Z<p>Chaos: /* Lecture Notes */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<ul><br />
<li>Thursday<br />
<ol><br />
<li>Processes and Their Models<br />
<li>Information Theory<br />
<li>Information in Processes<br />
<li>Memory in Processes<br />
</ol><br />
<li>Friday<br />
<ol><br />
<li>The Learning Channel<br />
<li>Causal Models<br />
<li>Measures of Complexity<br />
<li>Applications<br />
</ol><br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40676Module:Complexity2011-06-22T16:48:48Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Lecture Notes==<br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40674Module:Complexity2011-06-22T16:47:39Z<p>Chaos: /* Background */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Complexity&diff=40673Module:Complexity2011-06-22T16:47:22Z<p>Chaos: /* Background */</p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}}<br />
<br />
Organized by [[Jim Crutchfield]]<br />
<br />
==Background==<br />
<ul><br />
<li><br />
T. Cover and J. Thomas, Elements of Information Theory,<br />
Wiley, Second Edition (2006), Chapters 1-7.<br />
<li><br />
M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications,<br />
Springer, New York (1993).<br />
<li><br />
J. P. Crutchfield and D. P. Feldman,<br />
“Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS<br />
<b>13</b>:1 (2003) 25-54.<br />
</ul><br />
<br />
==Readings==<br />
<br />
'''[http://tuvalu.santafe.edu/~cmg/compmech/pubs/CalcEmergTitlePage.html The Calculi of Emergence]'''<br />
<br />
'''[http://arxiv.org/abs/cs/0001027 Pattern Discovery and Computational Mechanics]'''<br />
<br />
'''[http://arxiv.org/abs/1105.2988 Anatomy of a Bit]'''<br />
<br />
==Links==<br />
<br />
'''[http://172.29.16.101:8000 CMPy Notebook]'''</div>Chaoshttps://wiki.santafe.edu/index.php?title=Complex_Systems_Summer_School_2011-Schedule&diff=40672Complex Systems Summer School 2011-Schedule2011-06-22T16:44:12Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011}}<br />
<br />
<!-- put content below here --><br />
<br />
==Module 1: Nonlinear Dynamics and Modeling ==<br />
<br />
<br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
| <br/><br />
| <b>Wednesday, June 8</b><br />
|-<br />
|<br />
|<b>All events at St. John's College Unless Otherwise Noted</b><br />
|-<br />
|width="15%" | 12:00 p.m. - 5:00 p.m.<br />
|Registration and check-in at St. John's.<br />
|-<br />
|5:30 p.m. - 8:30 p.m.<br />
|Welcome Reception & Buffet Dinner with David Krakauer, Director, CSSS2011<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
| <br/><br />
| <b>Thursday, June 9</b><br />
<br />
|- <br />
|8:45 a.m. - 9:00 a.m.<br />
|Opening Remarks: Jerry Sabloff, President, Santa Fe Institute<br />
|-<br />
<br />
|- <br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|Liz Bradley, Nonlinear Dynamics I: Maps<br />
<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
| 10:45 a.m. - 12:00 p.m.<br />
|Liz Bradley, Nonlinear Dynamics II: Flows<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Jim Crutchfield, Symbolics I <br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Josh Garland, Computational Lab <br />
|-<br />
|4:15 p.m. - 5:00 p.m.<br />
|Names and Faces<br />
|-<br />
|5:00 p.m. - 6:30 p.m.<br />
|Dinner<br />
|-<br />
|5:30 p.m. - 6:45 p.m.<br />
|Geoffrey West, Lecture, "Searching for Simplicity and Unity in Complexity from Cells to Cities" <br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Friday, June 10</b><br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Liz Bradley, Nonlinear Dynamics III: Tools<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Josh Garland, Computational Lab II<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Liz Bradley, Nonlinear Dynamics IV: Applications<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Jim Crutchfield, Symbolics II<br />
|-<br />
|5:00 p.m. - 6:00 p.m.<br />
|Dinner<br />
|-<br />
|6:00 p.m.<br />
|Shuttle to Santa Fe Complex<br />
|-<br />
|6:00 p.m. - 7:45 p.m.<br />
|Field Trip to the [http://www.sfcomplex.org Santa Fe Complex]. Hosts [http://www.redfish.com/stephen.htm Stephen Guerin] and [http://www.redfish.com/owenDensmore.htm Owen Densmore].<br />
|-<br />
|8:00 p.m.<br />
|Return shuttle to St. John's College<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br />
| <b>Saturday, June 11</b><br />
<br />
|-<br />
|<br />
|<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br />
| <b>Sunday, June 12</b><br />
<br />
|-<br />
|6:30 ~ ?<br />
|Tom Carter, Modeling Etc.<br />
|}<br />
<br />
== Module 2: Networks ==<br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Monday, June 13</b><br />
<br />
|- <br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|Mark Newman, Networks I<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Mark Newman, Networks II<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Jennifer Dunne, Foodwebs I<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Jennifer Dunne, Foodwebs II<br />
|-<br />
|5:00 p.m.<br />
|Dinner<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Tuesday, June 14</b><br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Mark Newman, Networks III<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Cris Moore<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Mark Newman, Networks IV<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Networks Practical<br />
|-<br />
|5:00 p.m. - 6:15 p.m. <br />
|Dinner<br />
|-<br />
|6:15 p.m. - 7:30 p.m.<br />
|Alfred Hübler, Nonlinear Physics (In Action!)<br />
|}<br />
<br />
==Module 3: Computation==<br />
<br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Wednesday, June 15</b><br />
|-<br />
|<br />
|<b>All events at SFI</b><br />
|-<br />
|8:30 a.m.<br />
|Shuttles to SFI - meet at the Visitor's Circle at St. John's College<br />
|-<br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|Cris Moore<br />
<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break <br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Alex Russell<br />
|- <br />
|12:00 p.m. - 1:45 p.m.<br />
|Lunch<br />
|- <br />
|1:45 p.m. - 3:00 p.m.<br />
|Jared Saia<br />
|- <br />
|3:00 p.m. - 3:30 p.m.<br />
|Tea with SFI Community<br />
|- <br />
|3:30 p.m. - 4:45 p.m. <br />
|Time to work on Challenge Questions<br />
|-<br />
|4:45 p.m. - 6:00 p.m.<br />
|Welcome barbecue at SFI<br />
|-<br />
|6:00 p.m. - 7:00 p.m.<br />
|John Harte<br />
|-<br />
|7:00 p.m.<br />
|Return shuttle to St. John's College<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Thursday, June 16</b><br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Cris Moore<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Alex Russell<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Jared Saia <br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Time to work on Challenge Questions<br />
|- <br />
|7:00 p.m. - 8:30 p.m. <br />
|Tom Carter - Probability / fractals / etc.<br />
|}<br />
<br />
==Module 4: Robustness==<br />
<b>Organized by Jessica Flack</b><br />
<br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center" <br />
|<br><br />
| <b>Friday, June 17</b><br />
|- <br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|Jessica Flack<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Jessica Flack<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|-<br />
|1:30 p.m. - 2:45 p.m.<br />
|Nihat Ay<br />
|-<br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|-<br />
|3:00 p.m. - 4:15 p.m.<br />
|Nihat Ay<br />
|-<br />
|5:00 p.m. - 6:30 p.m.<br />
|Pizza with David Krakauer, Bridging Concepts <br />
|-<br />
|5:30 p.m. - 6:30 p.m.<br />
|Special Guest, Andrew Lovato, Santa Fe History<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
| <br><br />
|<b>Saturday, June 18</b><br />
|-<br />
|All Day<br />
|Bandelier Field Trip<br />
|-<br />
|7:00 p.m.<br />
|CSSS/Bread Loaf School of English Mixer in St. John's College coffee shop.<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Sunday, June 19</b><br />
|-<br />
|<br />
|<br />
<br />
|- bgcolor="#aaaaaa" align="center" <br />
|<br><br />
| <b>Monday, June 20</b><br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Dave Ackley<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Dave Ackley<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Jessica Flack<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Jessica Flack<br />
|-<br />
|4:15 p.m.<br />
|Voting on CSSS2011 [http://tuvalu.santafe.edu/events/workshops/index.php/Complex_Systems_Summer_School_2011-TShirts T-shirt design] **RESCHEDULED TO TOMORROW**<br />
<br />
|-<br />
|5:00 p.m. - 6:30 p.m.<br />
|Dinner<br />
|- <br />
|7:00 p.m. - 8:30-ish <br />
|Tom Carter -- High Finance (Black-Scholes), eine kleine symbolic dynamics, a strange repeller, and . . . <br> come when you can . . .<br />
|}<br />
<br />
==Module 5: Evolution==<br />
<b>Organized by David Krakauer</b><br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Tuesday, June 21</b><br />
<br />
|- <br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|David Krakauer<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|David Krakauer<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Jeremy Van Cleve<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|-<br />
|3:00 p.m.<br />
|T-shirt voting!<br />
<br />
|- <br />
|3:00 p.m. - 4:15 p.m. <br />
|Free <br />
|-<br />
|5:00 p.m.<br />
|Dinner<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Wednesday, June 22</b><br />
|-<br />
|<br />
|<b>All events at SFI</b><br />
|- <br />
|8:30 a.m.<br />
|Shuttles to SFI<br />
|- <br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|Tanmoy Bhattacharya <br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Laura Fortunato<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 3:00 p.m. <br />
|Practical<br />
|- <br />
|3:00 p.m. - 3:30 p.m.<br />
|Tea with SFI Community <br />
|- <br />
|3:30 p.m. - 4:45 p.m. <br />
|Practical<br />
|-<br />
|5:00 p.m.<br />
|Return shuttle to St. John's College<br />
|-<br />
|5:00 p.m.<br />
|Dinner<br />
|-<br />
|6:00 p.m.<br />
|Rodeo trip w/ [[JP]] (leave from SFI)<br />
|}<br />
<br />
==Module 6: Complexity==<br />
<b>Organized by Jim Crutchfield</b><br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| '''Thursday, June 23'''<br />
|-<br />
|width="15%" | 9:00 a.m. - 10:15 a.m.<br />
|Jim Crutchfield and Ryan James: Processes and Their Models<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Jim Crutchfield and Ryan James: Information Theory<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m. <br />
|Jim Crutchfield and Ryan James: Information in Processes<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:15 p.m. - 4:30 p.m. <br />
|Jim Crutchfield and Ryan James: Memory in Processes<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Friday, June 24 </b><br />
<br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Jim Crutchfield and Ryan James: The Learning Channel<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Jim Crutchfield and Ryan James: Causal Models<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Jim Crutchfield and Ryan James: Measures of Complexity<br />
|-<br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|-<br />
|3:00 p.m. - 4:15 p.m.<br />
|Jim Crutchfield and Ryan James: Applications<br />
|-<br />
|5:00 p.m. - 5:30 p.m. <br />
|Ginger Richardson and Barbara Kimbell, Life after Summer School<br />
|-<br />
|5:30 p.m. - 6:30 p.m. <br />
|Pizza with David Krakauer, Bridging Concepts<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Saturday, June 25 </b><br />
<br />
|-<br />
|All Day<br />
|Taos!<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Sunday, June 26 </b><br />
<br />
|-<br />
|<br />
|<br />
<br />
|}<br />
<br />
==Module 7: Emergence==<br />
<b>Organized by Simon DeDeo and James O'Dwyer</b><br />
<br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Monday, June 27</b><br />
<br />
|- <br />
|width="15%"| 9:00 a.m. - 10:15 a.m.<br />
|Simon DeDeo and James O'Dwyer<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Simon DeDeo and James O'Dwyer<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m. <br />
|Luis Bettencourt<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 4:15 p.m.<br />
|<br />
|-<br />
|6:30 p.m. - 8:00 p.m.<br />
|Tom Carter, Music and Meaning<br />
<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Tuesday, June 28</b><br />
<br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Simon DeDeo and James O'Dwyer<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Simon DeDeo and James O'Dwyer<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m. <br />
|Iain Couzin, Lecture<br />
|- <br />
|2:45 p.m. - 3:00 p.m.<br />
|Break<br />
|- <br />
|3:00 p.m. - 5:00 p.m.<br />
|Free Time<br />
|-<br />
|5:00 p.m. - 6:30 p.m.<br />
|Dinner<br />
|-<br />
|6:15 p.m. - 7:30 p.m.<br />
|George Johnson and John German, Science Writing<br />
|}<br />
<br />
==Module 8: Machine Learning==<br />
<b>Organized by Dan Rockmore</b><br />
<br />
{| border="1" cellpadding="2"<br />
! scope="col" width="100" align="center" | Time<br />
! scope="col" width="700" align="center" | Activity<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Wednesday, June 29</b><br />
<br />
|-<br />
|<br />
|<b>All events at SFI</b><br />
<br />
|- <br />
|8:30 a.m.<br />
|Shuttles to SFI<br />
|- <br />
|width="15%" |9:00 a.m. - 10:15 a.m.<br />
|Dan Rockmore<br />
|- <br />
|10:15 a.m. - 10:45 a.m.<br />
|Break <br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Dan Rockmore<br />
|- <br />
|12:00 p.m. - 1:45 p.m.<br />
|Lunch<br />
|- <br />
|1:45 p.m. - 3:00 p.m.<br />
|Cosma Shalizi - Nonparametric smoothing for data analysis. Kernel smoothing for regression. Kernel density estimation. Conditional densities. Prediction vs. inference.<br />
|- <br />
|3:00 p.m. - 3:30 p.m.<br />
|Tea with SFI Community<br />
<br />
|- <br />
|3:30 p.m. - 4:45 p.m. <br />
|Cosma Shalizi - Principles of statistical inference and learning. Model complexity and bias-variance trade-offs. Identification and consistency. Learning under dependence.<br />
|- <br />
|5:00 p.m. <br />
|Shuttle leaves for St. John's<br />
|- <br />
|5:00 p.m.-6:00 p.m. <br />
|Dinner <br />
|- <br />
|6:10 p.m. <br />
|Shuttle leaves for the James A. Little Theater for Iain Couzin's Public Lecture<br />
|- <br />
|6:30 p.m.-8:30 p.m. <br />
|SFI Public Lecture, Iain Couzin<br />
|- <br />
|9:00 p.m. <br />
|Shuttle leaves for St. John's<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Thursday, June 30</b><br />
|- <br />
|9:00 a.m. - 10:15 a.m.<br />
|Dan Rockmore<br />
<br />
|- <br />
|10:45 a.m. - 12:00 p.m.<br />
|Cosma Shalizi - Model selection and model checking. Information criteria: their myths and weaknesses. Model comparison tests. Specification tests. Using statistics for science.<br />
|- <br />
|12:00 p.m. - 1:30 p.m.<br />
|Lunch<br />
|- <br />
|1:30 p.m. - 2:45 p.m.<br />
|Greg Leibon<br />
<br />
|- <br />
|3:00 p.m. - 4:30 p.m. <br />
|Dan Rockmore<br />
<br />
<br />
|- bgcolor="#aaaaaa" align="center"<br />
|<br><br />
| <b>Friday, July 1: FINAL DAY</b><br />
|-<br />
|<br />
|<b>All events at SFI</b><br />
|-<br />
|8:30 a.m.<br />
|Buses Depart for SFI<br />
|- <br />
|9:00 a.m. - 12:00 p.m.<br />
|Reckless Ideas / CSSS Challenge<br />
<br />
|-<br />
|1:00 p.m. - 4:00 p.m.<br />
|Farewell BBQ<br />
<br />
|- <br />
|4:30 p.m.<br />
|Shuttle Back to St. John's College <br />
|}</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Concepts:_Nonlinearity&diff=39964Module:Concepts: Nonlinearity2011-06-10T17:50:04Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}} <br />
<br />
Organized by [http://www.cs.colorado.edu/~lizb/ Liz Bradley]<br />
<br />
==Readings==<br />
===Liz Bradley===<br />
[http://www.cs.colorado.edu/~lizb/na/ode-notes.pdf Numerical Solution of Differential Equations]<br> <br />
[http://www.cs.colorado.edu/~lizb/papers/ida-chapter.pdf Time Series Analysis]<br />
<br />
<br />
'''[http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html TISEAN 3.0.1: Nonlinear Time Series Analysis Software]'''<br />
<br />
<br />
==Slides==<br />
===Liz Bradley===<br />
<br />
[[Media:BradleySyllabus.pdf |Syllabus]]<br />
<br />
[[Media:BradleySlides.pdf |Slides]]<br />
<br />
===Jim Crutchfield===<br />
<br />
[[Media:DynamicsLecture1.pdf |Symbolic Dynamics Lecture 1 (PDF)]]<br />
<br />
[[Media:DynamicsLecture2.pdf |Symbolic Dynamics Lecture 2 (PDF)]]<br />
<br />
==Links==<br />
===Liz Bradley===<br />
<br />
'''[http://www.exploratorium.edu/complexity/java/lorenz.html Lorenz Attractor explorer]'''<br />
<br />
'''[http://projectguts.org/files/Lorenz.nlogo NetLogo Lorenz attractor]''' (Right click - save - open with NetLogo)<br />
<br />
===Symbolic Dynamics Web-based Labs===<br />
<br />
'''[http://ockham.local:8000 CMPy Notebook]'''<br />
<br />
==Workshop Data==<br />
===Liz Bradley===<br />
<br />
'''[[Media:Lab1.pdf | Lab 1]]'''<br />
<br />
'''[[Media:Lab 2.pdf | Lab 2]]'''<br />
<br />
Here is a zip archive of data files for your use in this lab:<br />
<br />
http://tuvalu.santafe.edu/files/DAT_files.zip</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Concepts:_Nonlinearity&diff=39963Module:Concepts: Nonlinearity2011-06-10T17:49:19Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}} <br />
<br />
Organized by [http://www.cs.colorado.edu/~lizb/ Liz Bradley]<br />
<br />
==Readings==<br />
===Liz Bradley===<br />
[http://www.cs.colorado.edu/~lizb/na/ode-notes.pdf Numerical Solution of Differential Equations]<br> <br />
[http://www.cs.colorado.edu/~lizb/papers/ida-chapter.pdf Time Series Analysis]<br />
<br />
<br />
'''[http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html TISEAN 3.0.1: Nonlinear Time Series Analysis Software]'''<br />
<br />
<br />
==Slides==<br />
===Liz Bradley===<br />
<br />
[[Media:BradleySyllabus.pdf |Syllabus]]<br />
<br />
[[Media:BradleySlides.pdf |Slides]]<br />
<br />
===Jim Crutchfield===<br />
<br />
[[Media:DynamicsLecture1.pdf |Symbolic Dynamics Lecture 1 (Slides)]]<br />
<br />
[[Media:DynamicsLecture2.pdf |Symbolic Dynamics Lecture 2 (Slides)]]<br />
<br />
==Links==<br />
===Liz Bradley===<br />
<br />
'''[http://www.exploratorium.edu/complexity/java/lorenz.html Lorenz Attractor explorer]'''<br />
<br />
'''[http://projectguts.org/files/Lorenz.nlogo NetLogo Lorenz attractor]''' (Right click - save - open with NetLogo)<br />
<br />
===Symbolic Dynamics Web-based Labs===<br />
<br />
'''[http://ockham.local:8000 CMPy Notebook]'''<br />
<br />
==Workshop Data==<br />
===Liz Bradley===<br />
<br />
'''[[Media:Lab1.pdf | Lab 1]]'''<br />
<br />
'''[[Media:Lab 2.pdf | Lab 2]]'''<br />
<br />
Here is a zip archive of data files for your use in this lab:<br />
<br />
http://tuvalu.santafe.edu/files/DAT_files.zip</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:DynamicsLecture1.pdf&diff=39962File:DynamicsLecture1.pdf2011-06-10T17:47:51Z<p>Chaos: uploaded a new version of "File:DynamicsLecture1.pdf":&#32;PDF of Crutchfield first lecture on symbolic dynamics</p>
<hr />
<div>Crutchfield first lecture in CSSS Dynamics Module on Symbolic Dynamics.</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:DynamicsLecture2.pdf&diff=39961File:DynamicsLecture2.pdf2011-06-10T17:46:29Z<p>Chaos: PDF of slides for Crutchfield lectures on symbolic dynamics</p>
<hr />
<div>PDF of slides for Crutchfield lectures on symbolic dynamics</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Concepts:_Nonlinearity&diff=39959Module:Concepts: Nonlinearity2011-06-10T17:45:49Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}} <br />
<br />
Organized by [http://www.cs.colorado.edu/~lizb/ Liz Bradley]<br />
<br />
==Readings==<br />
===Liz Bradley===<br />
[http://www.cs.colorado.edu/~lizb/na/ode-notes.pdf Numerical Solution of Differential Equations]<br> <br />
[http://www.cs.colorado.edu/~lizb/papers/ida-chapter.pdf Time Series Analysis]<br />
<br />
<br />
'''[http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html TISEAN 3.0.1: Nonlinear Time Series Analysis Software]'''<br />
<br />
<br />
==Slides==<br />
===Liz Bradley===<br />
<br />
[[Media:BradleySyllabus.pdf |Syllabus]]<br />
<br />
[[Media:BradleySlides.pdf |Slides]]<br />
<br />
===Jim Crutchfield===<br />
<br />
[[Media:DynamicsLecture1.pdf |Symbolic Dynamics Lecture 1 (Slides)]]<br />
<br />
[[Media:DynamicsLecture2.pdf |Symbolic Dynamics Lecture 2 (Slides)]]<br />
<br />
==Links==<br />
===Liz Bradley===<br />
<br />
'''[http://www.exploratorium.edu/complexity/java/lorenz.html Lorenz Attractor explorer]'''<br />
<br />
'''[http://projectguts.org/files/Lorenz.nlogo NetLogo Lorenz attractor]''' (Right click - save - open with NetLogo)<br />
<br />
===Jim Crutchfield===<br />
<br />
'''[http://ockham.local:8000 CMPy Notebook]'''<br />
<br />
==Workshop Data==<br />
===Liz Bradley===<br />
<br />
'''[[Media:Lab1.pdf | Lab 1]]'''<br />
<br />
'''[[Media:Lab 2.pdf | Lab 2]]'''<br />
<br />
Here is a zip archive of data files for your use in this lab:<br />
<br />
http://tuvalu.santafe.edu/files/DAT_files.zip</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Concepts:_Nonlinearity&diff=39896Module:Concepts: Nonlinearity2011-06-09T18:06:48Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}} <br />
<br />
Organized by [http://www.cs.colorado.edu/~lizb/ Liz Bradley]<br />
<br />
==Readings==<br />
===Liz Bradley===<br />
[http://www.cs.colorado.edu/~lizb/na/ode-notes.pdf Numerical Solution of Differential Equations]<br> <br />
[http://www.cs.colorado.edu/~lizb/papers/ida-chapter.pdf Time Series Analysis]<br />
<br />
<br />
'''[http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html TISEAN 3.0.1: Nonlinear Time Series Analysis Software]'''<br />
<br />
'''[[Media:Lab1.pdf | Logistic Map lab]]'''<br />
'''[[Media:Lab 2.pdf | Lorenz System Map lab]]'''<br />
<br />
<br />
==Slides==<br />
===Liz Bradley===<br />
<br />
[[Media:BradleySyllabus.pdf |Syllabus]]<br />
<br />
[[Media:BradleySlides.pdf |Slides]]<br />
<br />
==Links==<br />
===Liz Bradley===<br />
<br />
'''[http://www.exploratorium.edu/complexity/java/lorenz.html Lorenz Attractor explorer]'''<br />
<br />
'''[http://projectguts.org/files/Lorenz.nlogo NetLogo Lorenz attractor]''' (Right click - save - open with NetLogo)<br />
<br />
==Workshop Data==<br />
===Liz Bradley===<br />
<br />
Please download this zip package of data files:<br />
<br />
http://tuvalu.santafe.edu/files/DAT_files.zip<br />
<br />
==Slides==<br />
===Jim Crutchfield===<br />
<br />
[[Media:DynamicsLecture1.pdf |Symbolic Dynamics Lecture 1 (Slides)]]</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:DynamicsLecture1.pdf&diff=39895File:DynamicsLecture1.pdf2011-06-09T18:03:53Z<p>Chaos: Crutchfield first lecture in CSSS Dynamics Module on Symbolic Dynamics.</p>
<hr />
<div>Crutchfield first lecture in CSSS Dynamics Module on Symbolic Dynamics.</div>Chaoshttps://wiki.santafe.edu/index.php?title=Module:Concepts:_Nonlinearity&diff=39894Module:Concepts: Nonlinearity2011-06-09T18:03:11Z<p>Chaos: </p>
<hr />
<div>{{Complex Systems Summer School 2011 Modules}} <br />
<br />
Organized by [http://www.cs.colorado.edu/~lizb/ Liz Bradley]<br />
<br />
==Readings==<br />
===Liz Bradley===<br />
[http://www.cs.colorado.edu/~lizb/na/ode-notes.pdf Numerical Solution of Differential Equations]<br> <br />
[http://www.cs.colorado.edu/~lizb/papers/ida-chapter.pdf Time Series Analysis]<br />
<br />
<br />
'''[http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html TISEAN 3.0.1: Nonlinear Time Series Analysis Software]'''<br />
<br />
'''[[Media:Lab1.pdf | Logistic Map lab]]'''<br />
'''[[Media:Lab 2.pdf | Lorenz System Map lab]]'''<br />
<br />
<br />
==Slides==<br />
===Liz Bradley===<br />
<br />
[[Media:BradleySyllabus.pdf |Syllabus]]<br />
<br />
[[Media:BradleySlides.pdf |Slides]]<br />
<br />
==Links==<br />
===Liz Bradley===<br />
<br />
'''[http://www.exploratorium.edu/complexity/java/lorenz.html Lorenz Attractor explorer]'''<br />
<br />
'''[http://projectguts.org/files/Lorenz.nlogo NetLogo Lorenz attractor]''' (Right click - save - open with NetLogo)<br />
<br />
==Workshop Data==<br />
===Liz Bradley===<br />
<br />
Please download this zip package of data files:<br />
<br />
http://tuvalu.santafe.edu/files/DAT_files.zip<br />
<br />
==Slides==<br />
===Jim Crutchfield===<br />
<br />
[[Media:DynamicsLecture1.pdf |Slides]]</div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39050Randomness, Structure and Causality - Abstracts2011-01-22T20:20:16Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''A Geometric Approach to Complexity'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
I discuss several complexity measures of random fields from a geometric perspective. Central to this approach is the notion of multi-information, a generalization of mutual information. As demonstrated by Amari, information geometry allows one to decompose this measure in a natural way. In my talk I will show how this decomposition leads to a unifying scheme for various approaches to complexity. In particular, connections can be established to the complexity measure of Tononi, Sporns, and Edelman, and also to excess entropy (predictive information). In the second part of my talk, the interplay between complexity and causality (causality in Pearl's sense) will be discussed. A generalization of Reichenbach's common cause principle will play a central role in this regard.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
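The multi-information at the heart of this decomposition is the sum of the marginal entropies minus the joint entropy. A minimal sketch of that computation (the two-bit examples are ours, not from the talk):

```python
import math
from collections import Counter

def entropy(p):
    """Shannon entropy (bits) of a probability table given as a dict."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def multi_information(joint):
    """Multi-information: sum of marginal entropies minus joint entropy.

    `joint` maps outcome tuples (x1, ..., xn) to probabilities.
    """
    n = len(next(iter(joint)))
    marginals = []
    for i in range(n):
        m = Counter()
        for outcome, p in joint.items():
            m[outcome[i]] += p
        marginals.append(dict(m))
    return sum(entropy(m) for m in marginals) - entropy(joint)

# Two perfectly correlated fair bits: multi-information = 1 bit.
print(round(multi_information({(0, 0): 0.5, (1, 1): 0.5}), 6))  # -> 1.0
```

For independent variables the quantity is zero, so it measures total statistical dependence among the components.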
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue and more generally how it may have implications for other network structures in social and biological systems.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
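The synergy/redundancy distinction can be made concrete with a textbook toy calculation (our example, not from the talk): for Y = X1 XOR X2 with independent fair bits, each source alone carries zero information about Y, yet jointly they determine it completely, i.e. pure synergy.

```python
import math
from collections import Counter

def H(counts):
    """Shannon entropy (bits) of an empirical count table."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c)

def mutual_information(pairs):
    """I(A;B) in bits from a list of equally weighted (a, b) outcomes."""
    joint = Counter(pairs)
    a = Counter(x for x, _ in pairs)
    b = Counter(y for _, y in pairs)
    return H(a) + H(b) - H(joint)

# XOR: each source alone tells nothing about Y; together they determine it.
outcomes = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]  # uniform
i1 = mutual_information([(x1, y) for x1, x2, y in outcomes])
i2 = mutual_information([(x2, y) for x1, x2, y in outcomes])
i12 = mutual_information([((x1, x2), y) for x1, x2, y in outcomes])
print(i1, i2, i12)  # -> 0.0 0.0 1.0
```

The aggregate yields one full bit although the individual contributions sum to zero; replacing XOR with two identical copies of Y gives the opposite, redundant, case.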
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
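The random walk of increasing fitness can be caricatured in a few lines. This sketch is ours and deliberately trivial: the "organism" is a bit string and fitness is its bit count, whereas the model in the paper mutates actual computer programs; only the accept-if-fitter hill-climbing dynamic is illustrated.

```python
import random

random.seed(0)

def fitness(genome):
    # Toy fitness: number of 1-bits (stand-in for scoring a mutated program).
    return sum(genome)

def mutate(genome):
    # Point mutation: flip one randomly chosen bit.
    g = list(genome)
    g[random.randrange(len(g))] ^= 1
    return g

# Hill climb: keep a mutation only if it strictly increases fitness.
genome = [0] * 32
steps = 0
while fitness(genome) < 32:
    candidate = mutate(genome)
    if fitness(candidate) > fitness(genome):
        genome = candidate
    steps += 1
print(fitness(genome), steps)
```

Counting `steps` against the fitness gained gives the crude analogue of the "rate of evolutionary progress" the abstract refers to.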
----<br />
'''Framing Complexity''' [[Media:CrutchfieldTalkSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<br><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
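For a first-order Markov chain the past-future mutual information collapses to the one-step mutual information I(X_t; X_{t+1}), so the excess entropy can be computed in closed form. A sketch under that assumption, for a binary chain (the function name and parameterization are ours):

```python
import math

def H(ps):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def excess_entropy_markov(t0, t1):
    """Excess entropy of a binary first-order Markov chain.

    t0 = P(next=1 | current=0), t1 = P(next=1 | current=1).
    For a first-order chain, E = I(X_t ; X_{t+1}).
    """
    p1 = t0 / (t0 + (1 - t1))          # stationary P(state = 1)
    pi = (1 - p1, p1)
    # joint distribution over consecutive symbol pairs (00, 01, 10, 11)
    joint = []
    for cur, t in ((0, t0), (1, t1)):
        joint += [pi[cur] * (1 - t), pi[cur] * t]
    marg_next = (joint[0] + joint[2], joint[1] + joint[3])
    return H(pi) + H(marg_next) - H(joint)

# Unbiased coin flips: no past-future correlation, so E = 0.
print(round(excess_entropy_markov(0.5, 0.5), 6))  # -> 0.0
```

Strongly anti-persistent chains (e.g. `excess_entropy_markov(0.9, 0.1)`) store appreciable information in the present, which is exactly the quantity the talk dissects.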
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems is a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
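For the simplest ensemble mentioned above, symmetric binary Markov chains, the complexity-entropy diagram can be traced exactly: with flip probability t the entropy rate is the binary entropy H(t) and the one-step excess entropy is 1 - H(t). A minimal sketch (our own reduction, valid only for this first-order ensemble):

```python
import math

def H2(p):
    """Binary entropy (bits)."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Symmetric binary Markov chain with flip probability t:
# entropy rate h = H2(t); excess entropy E = I(X_t; X_{t+1}) = 1 - H2(t).
points = [(H2(t / 20), 1.0 - H2(t / 20)) for t in range(1, 20)]

# The (h, E) scatter lies on the line h + E = 1: complexity vanishes for
# fair coin flips (t = 1/2) and grows as the process becomes predictable.
for h, E in points:
    assert abs(h + E - 1.0) < 1e-12
print(min(E for _, E in points), max(E for _, E in points))
```

Richer ensembles (higher-order chains, epsilon-machines) fill out the region below this line rather than a single curve, which is what makes the diagram informative.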
----<br />
'''Introduction to the Workshop''' [[Media:MachtaWorkshopIntro.pdf|PDF]]<br />
<br><br />
<br><br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, and several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity''' [[Media:VilelaMendezTalksSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: clustering and <br />
synchronization, self-organized criticality and the topological structure of <br />
networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
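The most familiar of these ergodic parameters, the Lyapunov exponent, is just the orbit average of the local expansion rate, lambda = lim (1/N) sum_t log|f'(x_t)|. A minimal numerical sketch (an editor's illustration using the logistic map as the test system, not material from the talk):

```python
from math import log

def lyapunov_logistic(r, x0=0.3, n=100_000, burn=1_000):
    """Estimate the largest Lyapunov exponent of the logistic map
    f(x) = r*x*(1 - x) by averaging the local expansion rate
    log|f'(x)| = log|r*(1 - 2x)| along a single orbit."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

# At r = 4 the map is fully chaotic and the exact exponent is ln 2.
print(lyapunov_logistic(4.0))
```

Fluctuations of this local expansion rate about its mean are exactly the quantity the abstract relates to dynamical Rényi entropies.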
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk) [[Media:WiesnerTalkSlides.pdf|PDF]]<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39024Randomness, Structure and Causality - Abstracts2011-01-10T01:06:21Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''A Geometric Approach to Complexity'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
I discuss several complexity measures of random fields from a geometric perspective. Central to this approach is the notion of multi-information, a generalization of mutual information. As<br />
demonstrated by Amari, information geometry allows one to decompose this measure in a natural way. In my talk I will show how this decomposition leads to a unifying scheme of various approaches to complexity. In particular, connections to the complexity measure of Tononi, Sporns, and Edelman and also to excess entropy (predictive information) can be established. In the second part of my talk, the interplay between complexity and causality (causality in Pearl's sense) will be discussed. A generalization of Reichenbach's common cause principle will play a central role in this regard.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
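For readers new to the terminology: multi-information is the difference between the summed marginal entropies and the joint entropy, and it vanishes exactly when the variables are independent. A minimal sketch (editor's illustration, not part of the abstract):

```python
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a {outcome: probability} dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def multi_information(joint, n):
    """Multi-information I = sum_i H(X_i) - H(X_1,...,X_n) of a joint
    distribution over n-tuples; zero iff the variables are independent."""
    marginals = []
    for i in range(n):
        m = {}
        for outcome, p in joint.items():
            m[outcome[i]] = m.get(outcome[i], 0.0) + p
        marginals.append(m)
    return sum(entropy(m) for m in marginals) - entropy(joint)

# Two perfectly correlated fair bits: one bit of shared information.
print(multi_information({(0, 0): 0.5, (1, 1): 0.5}, 2))  # -> 1.0
```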
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue, and, more generally, how it may have implications for other network structures in social and biological systems.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
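The synergy/redundancy distinction has a textbook two-bit illustration (editor's example, not from the talk): for Y = X1 XOR X2 over uniform inputs, each source alone carries zero information about Y, yet the aggregated pair determines it completely.

```python
from math import log2
from collections import Counter

def mi(samples):
    """I(X;Y) in bits, estimated from a list of (x, y) samples."""
    n = len(samples)
    pxy = Counter(samples)
    px = Counter(x for x, _ in samples)
    py = Counter(y for _, y in samples)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Y = X1 XOR X2: aggregation of two correlated sources is synergistic,
# producing more information than the sum of the individual parts.
samples = [((a, b), a ^ b) for a in (0, 1) for b in (0, 1)]
whole = mi(samples)                              # I(X1,X2 ; Y)
part1 = mi([(a, y) for (a, b), y in samples])    # I(X1 ; Y)
part2 = mi([(b, y) for (a, b), y in samples])    # I(X2 ; Y)
print(whole, part1, part2)  # -> 1.0 0.0 0.0
```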
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
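To make the setting concrete, here is a deliberately simplified sketch of the random walk of increasing fitness the abstract describes, with bit strings standing in for Chaitin's mutating programs and the count of ones standing in for fitness (both substitutions are the editor's, not the paper's model):

```python
import random

def hill_climb(n_bits=32, steps=2000, seed=1):
    """Toy hill climber: flip one random bit of a single organism and
    keep the mutation only if fitness (here just the number of ones)
    does not decrease, giving a random walk of increasing fitness."""
    rng = random.Random(seed)
    org = [0] * n_bits
    fitness = sum(org)
    history = [fitness]
    for _ in range(steps):
        i = rng.randrange(n_bits)
        org[i] ^= 1
        if sum(org) >= fitness:
            fitness = sum(org)
        else:
            org[i] ^= 1          # reject the deleterious mutation
        history.append(fitness)
    return history

h = hill_climb()
print(h[0], h[-1])   # fitness at the start and end of the walk
```

The rate at which `history` climbs is the toy analogue of the paper's rate of evolutionary progress.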
----<br />
'''Framing Complexity''' [[Media:CrutchfieldTalkSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<p><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
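A minimal numerical illustration of excess entropy as past/future mutual information (editor's toy example, not from the paper): for the period-2 process ...0101..., a single past symbol determines the entire future, so the shared information is one bit.

```python
from math import log2
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits, estimated from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Period-2 process: take the symbol at time t as the "past" and the
# symbol at t + 1 as the "future"; their mutual information is ~1 bit,
# the excess entropy of this process.
seq = [t % 2 for t in range(10_000)]
pairs = list(zip(seq, seq[1:]))
print(mutual_information(pairs))
```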
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems are a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
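One point on such a complexity-entropy diagram can be computed in closed form for a binary Markov chain, using the standard identities for the entropy rate and, for a first-order chain, E = H[stationary distribution] - entropy rate (editor's sketch; the parameter names are the editor's):

```python
from math import log2

def binary_entropy(p):
    """H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def complexity_entropy_point(p, q):
    """(entropy rate, excess entropy) for the two-state Markov chain
    with transition probabilities P(0->1) = p and P(1->0) = q, p+q > 0.
    For a first-order chain, E = H[stationary] - entropy rate."""
    pi0 = q / (p + q)                                        # stationary distribution
    h = pi0 * binary_entropy(p) + (1 - pi0) * binary_entropy(q)   # entropy rate
    e = binary_entropy(pi0) - h                              # excess entropy
    return h, e

# A fair coin (p = q = 0.5) is maximally random but stores nothing:
print(complexity_entropy_point(0.5, 0.5))  # -> (1.0, 0.0)
```

Sweeping `p` and `q` and scatter-plotting the returned pairs reproduces the Markov-chain ensemble's complexity-entropy diagram described in the talk.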
----<br />
'''Introduction to the Workshop''' [[Media:MachtaWorkshopIntro.pdf|PDF]]<br />
<br><br />
<br><br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, and several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity''' [[Media:VilelaMendezTalksSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: clustering and <br />
synchronization, self-organized criticality and the topological structure of <br />
networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk) [[Media:WiesnerTalkSlides.pdf|PDF]]<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:WiesnerTalkSlides.pdf&diff=39023File:WiesnerTalkSlides.pdf2011-01-10T01:05:45Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39022Randomness, Structure and Causality - Abstracts2011-01-10T01:04:59Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''A Geometric Approach to Complexity'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
I discuss several complexity measures of random fields from a geometric perspective. Central to this approach is the notion of multi-information, a generalization of mutual information. As<br />
demonstrated by Amari, information geometry allows one to decompose this measure in a natural way. In my talk I will show how this decomposition leads to a unifying scheme of various approaches to complexity. In particular, connections to the complexity measure of Tononi, Sporns, and Edelman and also to excess entropy (predictive information) can be established. In the second part of my talk, the interplay between complexity and causality (causality in Pearl's sense) will be discussed. A generalization of Reichenbach's common cause principle will play a central role in this regard.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue, and, more generally, how it may have implications for other network structures in social and biological systems.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
----<br />
'''Framing Complexity''' [[Media:CrutchfieldTalkSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<p><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
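For binary first-order Markov chains the two coordinates of a complexity-entropy diagram have closed forms (the excess entropy reduces to E = H(1) - h_mu), so an ensemble of diagram points can be generated directly. A hypothetical sketch, not the talk's own code:

```python
from math import log2

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def diagram_point(p01, p10):
    """(entropy rate, excess entropy) for a binary first-order Markov chain
    with transition probabilities P(1|0) = p01 and P(0|1) = p10."""
    pi0 = p10 / (p01 + p10)                  # stationary distribution
    pi1 = 1.0 - pi0
    h = pi0 * Hb(p01) + pi1 * Hb(p10)        # entropy rate h_mu
    E = Hb(pi1) - h                          # excess entropy: E = H(1) - h_mu
    return h, E

# an ensemble of chains -> a cloud of points in the complexity-entropy plane
points = [diagram_point(p, q)
          for p in (0.1, 0.3, 0.5, 0.7, 0.9)
          for q in (0.1, 0.3, 0.5, 0.7, 0.9)]
```

The unbiased chain (p01 = p10 = 0.5) lands at (1, 0): maximal entropy rate, zero memory.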
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information-theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems are a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Introduction to the Workshop''' [[Media:MachtaWorkshopIntro.pdf|PDF]]<br />
<br><br />
<br><br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
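The quantities involved can be made concrete with the well-known Golden Mean Process (a standard textbook example, not one taken from this talk): crypticity is the gap between the statistical complexity C_mu and the excess entropy E.

```python
from math import log2

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Golden Mean Process: binary sequences with the word "00" forbidden.
# epsilon-machine: state A emits 1 (prob 1/2, to A) or 0 (prob 1/2, to B);
# state B must emit 1 and return to A.
pi_A, pi_B = 2 / 3, 1 / 3      # stationary distribution over causal states
C_mu = Hb(pi_B)                # statistical complexity, about 0.918 bits
h_mu = pi_A * Hb(0.5)          # entropy rate = 2/3 bit/symbol (B is deterministic)
E = Hb(1 / 3) - h_mu           # excess entropy; the output is order-1 Markov,
                               # so E = H(1) - h_mu, about 0.252 bits
chi = C_mu - E                 # crypticity chi = C_mu - E, about 0.667 bits
```

Here most of the stored state information (about 0.67 of 0.92 bits) is hidden from the observed sequence, which is exactly what the crypticity quantifies.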
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
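All of the filters mentioned operate on the space-time field of a cellular automaton. A minimal elementary-CA simulator that produces such fields (the filters themselves are beyond this sketch):

```python
def eca_step(row, rule):
    """One synchronous update of an elementary CA with periodic boundaries.
    Bit k of `rule` gives the output for neighborhood k (Wolfram numbering)."""
    n = len(row)
    return [(rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
            for i in range(n)]

def run_eca(rule, row, steps):
    """Return the space-time field: a list of rows, one per time step."""
    field = [list(row)]
    for _ in range(steps):
        field.append(eca_step(field[-1], rule))
    return field
```

As a sanity check, rule 184 (the "traffic" rule) is number-conserving, so the density of 1s in each row of its field is constant over time.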
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, as well as several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
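The one-dimensional analogue of the monomer-dimer problem has an easily computed entropy density, which gives a feel for the template statistics mentioned above. A toy sketch, assuming a 1 x n strip rather than the 2D lattice of the talk:

```python
from math import log2

def monomer_dimer_count(n):
    """Number of tilings of a 1 x n strip by monomers and dimers.
    Recurrence T(n) = T(n-1) + T(n-2): end the strip with a monomer or a dimer."""
    a, b = 1, 1          # T(0) = 1 (empty strip), T(1) = 1 (single monomer)
    for _ in range(n - 1):
        a, b = b, a + b
    return b

# entropy per site converges to log2 of the golden ratio, about 0.694 bits
entropy_per_site = log2(monomer_dimer_count(400)) / 400
golden = (1 + 5 ** 0.5) / 2
```

In 1D the count is just the Fibonacci sequence; the 2D problem treated in the paper requires transfer-matrix or template-statistics methods instead.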
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity''' [[Media:VilelaMendezTalksSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Renyi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: clustering and <br />
synchronization, self-organized criticality, and the topological structure of <br />
networks.<br />
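The Lyapunov exponent is the most familiar of these ergodic parameters: the time average of the log of the local expansion rate along an orbit. A numerical sketch for the logistic map (an illustration, not the talk's cocycle formulation); for r = 4 the exact value is ln 2.

```python
from math import log

def lyapunov_logistic(r, x0=0.3, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> r x (1 - x) as the
    time average of log|f'(x)| = log|r (1 - 2x)| along an orbit."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        d = abs(r * (1 - 2 * x)) or 1e-300  # guard a (measure-zero) hit of x = 1/2
        total += log(d)
        x = r * x * (1 - x)
    return total / n
```

A negative value (e.g. at r = 3.2, inside the period-2 window) signals a stable periodic orbit rather than chaos.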
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:VilelaMendezTalksSlides.pdf&diff=39021File:VilelaMendezTalksSlides.pdf2011-01-10T01:03:39Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39020Randomness, Structure and Causality - Abstracts2011-01-10T00:54:16Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''A Geometric Approach to Complexity'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
I discuss several complexity measures of random fields from a geometric perspective. Central to this approach is the notion of multi-information, a generalization of mutual information. As<br />
demonstrated by Amari, information geometry allows one to decompose this measure in a natural way. In my talk I will show how this decomposition leads to a unifying scheme of various approaches to complexity. In particular, connections to the complexity measure of Tononi, Sporns, and Edelman and also to excess entropy (predictive information) can be established. In the second part of my talk, the interplay between complexity and causality (causality in Pearl's sense) will be discussed. A generalization of Reichenbach's common cause principle will play a central role in this regard.<br />
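Multi-information, the quantity central to this approach, is the sum of the marginal entropies minus the joint entropy. A small sketch over an explicit joint distribution (illustrative only; the talk's geometric decomposition is not reproduced here):

```python
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def multi_information(joint):
    """Multi-information I(X1;...;Xn) = sum_i H(Xi) - H(X1,...,Xn).
    `joint` maps tuples of outcomes to probabilities."""
    n = len(next(iter(joint)))
    marginals = []
    for i in range(n):
        m = {}
        for outcome, p in joint.items():
            m[outcome[i]] = m.get(outcome[i], 0.0) + p
        marginals.append(m)
    return sum(entropy(m) for m in marginals) - entropy(joint)

# two perfectly correlated fair bits carry 1 bit of multi-information
I = multi_information({(0, 0): 0.5, (1, 1): 0.5})
```

For independent variables the measure vanishes, which is the sense in which it generalizes mutual information to many parts.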
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state-machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models like the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue and more generally how it may have implications for other network structures in social and biological systems.<br />
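Synergy and redundancy can be made concrete with mutual information over a toy joint distribution. The XOR example below is purely synergistic: neither input alone tells you anything about the output, but together they determine it. This is a standard illustration, not an example taken from the talk.

```python
from itertools import product
from math import log2

def entropy(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

def marginal(joint, keep):
    """Marginalize a joint dict over tuple outcomes onto the indices in `keep`."""
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        m[key] = m.get(key, 0.0) + p
    return m

def mutual_info(joint, a, b):
    """I(A;B) = H(A) + H(B) - H(A,B); `a` and `b` are tuples of indices."""
    Ha = entropy(marginal(joint, a).values())
    Hb = entropy(marginal(joint, b).values())
    Hab = entropy(marginal(joint, a + b).values())
    return Ha + Hb - Hab

# Y = X1 xor X2 with X1, X2 uniform iid: outcomes are (x1, x2, y)
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
whole = mutual_info(joint, (0, 1), (2,))                                  # 1 bit
parts = mutual_info(joint, (0,), (2,)) + mutual_info(joint, (1,), (2,))   # 0 bits
```

Aggregating the two sources yields strictly more information than the sum of their individual contributions, i.e. synergy.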
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
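The hill-climbing random walk of increasing fitness can be caricatured with bit strings standing in for the computer programs of the talk's models (a toy sketch under that substitution, not Chaitin's construction):

```python
import random

def fitness(bits):
    """Toy fitness: number of 1s (a stand-in for a real fitness landscape)."""
    return sum(bits)

def hill_climb(n=64, steps=2000, seed=0):
    """Single organism, random point mutations, keeping only strict improvements.
    Returns the fitness history of the walk."""
    rng = random.Random(seed)
    org = [0] * n
    history = [fitness(org)]
    for _ in range(steps):
        mutant = org[:]
        mutant[rng.randrange(n)] ^= 1      # random point mutation
        if fitness(mutant) > fitness(org): # accept only increases
            org = mutant
        history.append(fitness(org))
    return history
```

The history is non-decreasing by construction; the quantity of interest in the talk is how fast such a walk climbs, i.e. the rate of evolutionary progress.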
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
----<br />
'''Framing Complexity''' [[Media:CrutchfieldTalkSlides.pdf|PDF]]<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<p><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information-theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems are a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Introduction to the Workshop''' [[Media:MachtaWorkshopIntro.pdf|PDF]]<br />
<br><br />
<br><br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, as well as several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Renyi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: clustering and <br />
synchronization, self-organized criticality, and the topological structure of <br />
networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:CrutchfieldTalkSlides.pdf&diff=39019File:CrutchfieldTalkSlides.pdf2011-01-10T00:53:27Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:MachtaWorkshopIntro.pdf&diff=39018File:MachtaWorkshopIntro.pdf2011-01-10T00:49:48Z<p>Chaos: </p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39017Randomness, Structure and Causality - Abstracts2011-01-10T00:49:26Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''A Geometric Approach to Complexity'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
I discuss several complexity measures of random fields from a geometric perspective. Central to this approach is the notion of multi-information, a generalization of mutual information. As<br />
demonstrated by Amari, information geometry allows one to decompose this measure in a natural way. In my talk I will show how this decomposition leads to a unifying scheme of various approaches to complexity. In particular, connections to the complexity measure of Tononi, Sporns, and Edelman and also to excess entropy (predictive information) can be established. In the second part of my talk, the interplay between complexity and causality (causality in Pearl's sense) will be discussed. A generalization of Reichenbach's common cause principle will play a central role in this regard.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state-machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models like the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue and more generally how it may have implications for other network structures in social and biological systems.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
----<br />
'''Framing Complexity'''<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<br><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information-theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems is a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Introduction to the Workshop''' [[Media:MachtaWorkshopIntro.pdf|PDF]]<br />
<br><br />
<br><br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, and several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emergent property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Renyi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: clustering and <br />
synchronization, self-organized criticality and the topological structure of <br />
networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
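A minimal numerical sketch of the most familiar of these ergodic parameters, the Lyapunov exponent, estimated as the orbit average of log|f'(x)|. The logistic map and all parameter values here are illustrative choices, not from the talk:

```python
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as the orbit
    average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        # guard against log(0) if the orbit lands exactly on x = 1/2
        total += math.log(max(abs(r * (1 - 2 * x)), 1e-300))
    return total / n_iter
```

At r = 4 the exact exponent is ln 2 ≈ 0.693, a useful sanity check; at r = 2.9 the orbit falls onto a stable fixed point and the estimate is negative.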
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Participants&diff=39016Randomness, Structure and Causality - Participants2011-01-09T00:49:22Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<table border="1"><br />
<tr><br />
<th>Name</th><br />
<th>Email</th><br />
<th>Institution</th><br />
<th>Talk</th><br />
<th>Paper</th><br />
</tr><br />
<tr><br />
<td>*Ay, Nihat</td><br />
<td>nay@mis.mpg.de</td> <br />
<td>SFI & Max Planck Institute</td><br />
<td>A Geometric Approach to Complexity</td><br />
<td>[[http://arxiv.org/abs/1001.2686]]</td><br />
</tr><br />
<tr><br />
<td>*Bell, Tony</td><br />
<td>tony@salk.edu</td><br />
<td>UC Berkeley</td><br />
<td>Learning Out of Equilibrium</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Bettencourt, Luis</td><br />
<td>lmbettencourt@gmail.com</td><br />
<td>SFI & LANL</td><br />
<td>Information Aggregation in Correlated Complex Systems and Optimal Estimation</td><br />
<td>[[http://arxiv.org/abs/0712.2218]]</td><br />
</tr><br />
<tr><br />
<td>*Chaitin, Gregory</td><br />
<td>gjchaitin@gmail.com</td><br />
<td>IBM Watson Research Center</td><br />
<td>To a Mathematical Theory of Evolution and Biological Creativity</td><br />
<td>[[Media:Darwin.pdf| Paper]]</td><br />
</tr><br />
<tr> <br />
<td>*Crutchfield, James</td><br />
<td>chaos@cse.ucdavis.edu</td><br />
<td>SFI & UC Davis</td><br />
<td>Framing Complexity</td><br />
<td>[[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]</td><br />
</tr><br />
<tr><br />
<td>*Debowski, Lukasz</td><br />
<td>ldebowsk@ipipan.waw.pl</td><br />
<td>Polish Academy of Sciences</td><br />
<td>The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts</td><br />
<td>[[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]</td><br />
</tr><br />
<tr><br />
<td>*Ellison, Christopher</td><br />
<td>cellison@cse.ucdavis.edu</td><br />
<td>UC Davis</td><br />
<td>Prediction, Retrodiction, and the Amount of Information Stored in the Present</td><br />
<td>[[http://arxiv.org/abs/0905.3587]]</td><br />
</tr><br />
<tr><br />
<td>*Feldman, David</td><br />
<td>dave@hornacek.coa.edu</td><br />
<td>College of the Atlantic</td><br />
<td>Complexity Measures and Frustration</td><br />
<td>[[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]] </td><br />
</tr><br />
<tr><br />
<td>*Mahoney, John</td><br />
<td>jmahoney3@ucmerced.edu</td><br />
<td>UC Merced</td><br />
<td>Crypticity and Information Accessibility</td><br />
<td>[[http://arxiv.org/abs/0905.4787]]</td><br />
</tr><br />
<tr><br />
<td>*Machta, Jon</td><br />
<td>machta@physics.umass.edu</td><br />
<td>SFI & University of Massachusetts</td><br />
<td>Complexity, Parallel Computation and Statistical Physics</td><br />
<td>[[http://arxiv.org/abs/cond-mat/0510809]]</td><br />
</tr><br />
<tr><br />
<td>*Mitchell, Melanie</td><br />
<td>mm@cs.pdx.edu</td><br />
<td>SFI & Portland State University</td><br />
<td>Automatic Identification of Information-Processing Structures in Cellular Automata</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Moore, Cris</td><br />
<td>moore@cs.unm.edu</td><br />
<td>SFI & University of New Mexico</td><br />
<td>Phase Transitions and Computational Complexity</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Shaw, Robert Stetson</td><br />
<td>rob@protolife.net</td><br />
<td>ProtoLife, Inc.</td><br />
<td>Dominos, Ergodic Flows</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Still, Suzanne</td><br />
<td>sstill@hawaii.edu</td><br />
<td>University of Hawaii at Manoa</td><br />
<td>Statistical Mechanics of Interactive Learning</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Vilela-Mendes, Rui</td><br />
<td>vilela@cii.fc.ul.pt</td><br />
<td>University of Lisbon</td><br />
<td>Ergodic Parameters and Dynamical Complexity</td><br />
<td>[[http://arxiv.org/abs/1008.2664]]</td><br />
</tr><br />
<tr><br />
<td>Trabesinger, Andreas</td><br />
<td>a.trabesinger@nature.com</td><br />
<td>Nature Physics Magazine</td><br />
<td>Measuring Complexity?</td><br />
<td>[[http://www.nature.com/nphys/]]</td><br />
</tr><br />
<tr><br />
<td>*Wiesner, Karoline</td><br />
<td>k.wiesner@bristol.ac.uk</td><br />
<td>University of Bristol</td><br />
<td>Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States</td><br />
<td>[[http://arxiv.org/abs/1002.2337]]</td><br />
</tr><br />
</table><br />
<nowiki>*Confirmed</nowiki></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:Agenda.pdf&diff=39015File:Agenda.pdf2011-01-09T00:48:50Z<p>Chaos: uploaded a new version of "File:Agenda.pdf":&#32;Agenda</p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39014Randomness, Structure and Causality - Abstracts2011-01-09T00:48:28Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''A Geometric Approach to Complexity'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
I discuss several complexity measures of random fields from a geometric perspective. Central to this approach is the notion of multi-information, a generalization of mutual information. As<br />
demonstrated by Amari, information geometry allows one to decompose this measure in a natural way. In my talk I will show how this decomposition leads to a unifying scheme of various approaches to complexity. In particular, connections to the complexity measure of Tononi, Sporns, and Edelman and also to excess entropy (predictive information) can be established. In the second part of my talk, the interplay between complexity and causality (causality in Pearl's sense) will be discussed. A generalization of Reichenbach's common cause principle will play a central role in this regard.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
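As a concrete handle on the multi-information mentioned above (the sum of the marginal entropies minus the joint entropy), here is a small sketch for discrete variables; the XOR example is an illustrative choice, not from the talk:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def multi_information(joint):
    """Multi-information: sum of marginal entropies minus the joint entropy."""
    joint = np.asarray(joint, dtype=float)
    marginals = sum(
        entropy(joint.sum(axis=tuple(j for j in range(joint.ndim) if j != i)))
        for i in range(joint.ndim))
    return marginals - entropy(joint.ravel())

# Three pairwise-independent fair bits with Z = X xor Y: each marginal
# carries 1 bit, the joint entropy is 2 bits, so the multi-information
# is 1 bit even though every pair of variables is independent.
xor = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        xor[x, y, x ^ y] = 0.25
```

For three fully independent bits the same function returns zero, as it must.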
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state-machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward- and backward-time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information-theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue, and, more generally, how it may have implications for other network structures in social and biological systems.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
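A toy sketch of the random walk of increasing fitness: a single organism accepts a random point mutation only when fitness rises. This hill climb on bit strings is an illustrative stand-in; Chaitin's actual model lives in the space of computer programs, with fitness tied to the Busy Beaver problem:

```python
import random

def hill_climb(n_bits=32, steps=2000, seed=0):
    """Single mutating organism; keep a random one-bit mutation
    only if it strictly increases fitness."""
    rng = random.Random(seed)
    genome = [rng.randint(0, 1) for _ in range(n_bits)]
    fitness = sum(genome)              # toy fitness: number of 1 bits
    history = [fitness]
    for _ in range(steps):
        i = rng.randrange(n_bits)      # random point mutation
        genome[i] ^= 1
        if sum(genome) > fitness:
            fitness = sum(genome)      # beneficial: keep it
        else:
            genome[i] ^= 1             # neutral/harmful: revert
        history.append(fitness)
    return history

h = hill_climb()
```

By construction the fitness history is nondecreasing; the interesting question in the talk is the *rate* of that increase in the program-space setting.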
----<br />
'''Framing Complexity'''<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<br><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
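Plugging illustrative numbers into the informal statement gives a feel for the scaling of the vocabulary bound:

```python
import math

def vocab_lower_bound(n, beta):
    """Informal bound from the theorem: a text of length n that repetitively
    describes n**beta independent facts has at least n**beta / log(n)
    distinct words (nonterminals of the shortest grammar-based encoding)."""
    return n ** beta / math.log(n)

# for beta = 1/2 the bound grows roughly like sqrt(n)/log(n), a
# Herdan/Heaps-type power law in the text length
bounds = [vocab_lower_bound(10 ** k, 0.5) for k in (3, 4, 5, 6)]
```

The numbers and the choice beta = 1/2 are illustrative only; the theorem's content is the power-law growth itself.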
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
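For a first-order Markov chain, the past-future mutual information reduces to I(X_t; X_{t+1}), so the excess entropy can be computed directly from the transition matrix. A minimal sketch of that special case (illustrative; not the authors' code, which handles general processes via causal states):

```python
import numpy as np

def excess_entropy_markov(T):
    """Excess entropy E = I(X_t; X_{t+1}) for a stationary first-order
    Markov chain with row-stochastic transition matrix T."""
    # stationary distribution: left eigenvector of T for eigenvalue 1
    w, v = np.linalg.eig(T.T)
    p = np.real(v[:, np.argmax(np.real(w))])
    p = p / p.sum()
    joint = p[:, None] * T                    # P(X_t = i, X_{t+1} = j)
    indep = p[:, None] * p[None, :]           # product of the marginals
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / indep[mask])))

q = 0.1
T = np.array([[1 - q, q], [q, 1 - q]])        # binary chain, flip prob. q
E = excess_entropy_markov(T)                  # equals 1 - H_2(q) here
```

For the symmetric binary chain the closed form E = 1 - H_2(q) provides a check, and a fair-coin chain (q = 1/2) gives E = 0, as it should.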
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information-theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems is a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
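For the Markov-chain ensemble mentioned above, both coordinates of the complexity-entropy diagram have closed forms: the entropy rate h_mu, and (since an order-1 chain's block-entropy converges after one symbol) the excess entropy E = H(X) - h_mu. A small sketch of such a diagram for a random two-state ensemble (an illustration, not the talk's code):

```python
import numpy as np

def h2(x):
    """Binary entropy in bits, clipped away from 0 and 1."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return float(-x * np.log2(x) - (1 - x) * np.log2(1 - x))

def markov_point(a, b):
    """One (h_mu, E) point of a complexity-entropy diagram for the chain
    with transition matrix [[1-a, a], [b, 1-b]]."""
    p0 = b / (a + b)                      # stationary P(state 0)
    h = p0 * h2(a) + (1 - p0) * h2(b)     # entropy rate h_mu
    return h, h2(p0) - h                  # excess entropy E = H(X) - h_mu

rng = np.random.default_rng(1)
cloud = [markov_point(*rng.uniform(0.01, 0.99, size=2)) for _ in range(500)]
```

Scatter-plotting `cloud` gives the Markov-chain region of the diagram: E vanishes both at zero entropy rate and at the fair-coin point h_mu = 1, with structured chains in between.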
----<br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, as well as several topics that are current hotbeds of interdisciplinary research, such as phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
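The phase transitions mentioned in the abstract can be seen in miniature: random 3-SAT formulas flip from almost-surely satisfiable to almost-surely unsatisfiable as the clause-to-variable ratio crosses a threshold (near 4.27 in the large-size limit). A small brute-force experiment, our own illustration with arbitrarily chosen sizes and seed:

```python
import random
from itertools import product

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-SAT formula: each clause picks 3 distinct variables, random signs."""
    return [[(v, rng.choice([True, False]))
             for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def is_satisfiable(n_vars, formula):
    """Brute force over all 2^n assignments (fine for small n)."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[v] == sign for v, sign in clause) for clause in formula):
            return True
    return False

def sat_fraction(n_vars, ratio, trials, rng):
    """Fraction of random formulas at clause density `ratio` that are satisfiable."""
    m = int(ratio * n_vars)
    return sum(is_satisfiable(n_vars, random_3sat(n_vars, m, rng))
               for _ in range(trials)) / trials

rng = random.Random(0)
for ratio in (2.0, 4.0, 6.0):
    print(ratio, sat_fraction(10, ratio, 20, rng))
```

Even at 10 variables the trend is visible: nearly all formulas at density 2 are satisfiable, nearly all at density 6 are not, with the crossover in between.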
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Susanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Renyi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of dynamical systems is illustrated by some examples: clustering and synchronization, self-organized criticality, and the topological structure of networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
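As a point of reference for "the amount of information required for optimal prediction": in the classical causal-state setting, the statistical complexity is the Shannon entropy of the stationary distribution over causal states. A minimal sketch for the standard two-state golden-mean process, our own illustration rather than an example from the paper:

```python
import math

def stationary(T, iters=1000):
    """Stationary distribution of a row-stochastic matrix, by power iteration."""
    n = len(T)
    p = [1.0 / n] * n
    for _ in range(iters):
        p = [sum(p[i] * T[i][j] for i in range(n)) for j in range(n)]
    return p

def statistical_complexity(T):
    """C_mu: Shannon entropy (bits) of the stationary causal-state distribution."""
    return -sum(pi * math.log2(pi) for pi in stationary(T) if pi > 0)

# Golden-mean process: state A emits 0 or 1 with probability 1/2 (a 1 leads to B);
# state B must emit 0 and return to A, so no two consecutive 1s ever occur.
T = [[0.5, 0.5],
     [1.0, 0.0]]
print(statistical_complexity(T))  # about 0.918 bits, i.e. H[2/3, 1/3]
```

The stationary state distribution is (2/3, 1/3), so the classical model must store about 0.918 bits; the quantum constructions in the paper can get below such classical values.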
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=File:Agenda.pdf&diff=39013File:Agenda.pdf2011-01-07T21:09:27Z<p>Chaos: uploaded a new version of "File:Agenda.pdf":&#32;Agenda</p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:Agenda.pdf&diff=39012File:Agenda.pdf2011-01-07T19:19:39Z<p>Chaos: uploaded a new version of "File:Agenda.pdf":&#32;Agenda</p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:Agenda.pdf&diff=39011File:Agenda.pdf2011-01-07T18:42:51Z<p>Chaos: uploaded a new version of "File:Agenda.pdf":&#32;Agenda</p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Bios&diff=39008Randomness, Structure and Causality - Bios2011-01-07T01:06:46Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br><br />
<br><br />
----<br />
[[Image:Ay.jpg|200px|left]]<br />
'''Nihat Ay''', Associate Professor of Mathematics, University of Leipzig<br />
<br />
Nihat studied mathematics and physics at the Ruhr University Bochum and received his Ph.D. in mathematics from the University of Leipzig in 2001. In 2003 and 2004 he was a postdoctoral fellow at the Santa Fe Institute and at the Redwood Neuroscience Institute (now the Redwood Center for Theoretical Neuroscience at UC Berkeley). After his postdoctoral stay in the USA he became a member of the Mathematical Institute of the Friedrich Alexander University in Erlangen at the assistant professor level. Since September 2005 he has worked as a Max Planck Research Group Leader at the Max Planck Institute for Mathematics in the Sciences in Leipzig, where he heads the group Information Theory of Cognitive Systems. As an external professor of the Santa Fe Institute he was involved in research on complexity and robustness theory. Since September 2009 he has been affiliated with the University of Leipzig as associate professor (Privatdozent) for mathematics.<br />
<br><br />
<br><br />
----<br />
[[Image:Bell.jpg|200px|left]]<br />
'''Anthony Bell''', Research Scientist, Redwood Center for Theoretical Neuroscience, UC Berkeley<br />
<br />
Tony's long-term scientific goal is to work out how the brain learns (self-organises). This has taken him in directions of Information Theory and probability theory for neural networks. This provides a hopelessly crude and impoverished model (called redundancy reduction) of what the brain does and how it lives in its world. Unfortunately, it's the best we have at the moment. We have to do some new mathematics before we reach self-organisational principles that will apply to the physical substrate of the brain, which is molecular: ion channels, enzyme complexes, gene expression networks. We have to think about dynamics, loops, open systems, how open dynamical systems can encode and effect the spatio-temporal trajectories of their perturbing inputs.<br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Bettencourt.jpg|200px|left]]<br />
'''Luis Bettencourt''', External Professor, Santa Fe Institute; Staff Researcher, Theoretical Division, LANL<br />
<br />
Luís M. A. Bettencourt carries out research on the structure and dynamics of several complex systems, with an emphasis on dynamical problems in biology and society. Currently <br />
he works on information processing in neural systems, information theoretic optimization in collective behavior, urban organization and dynamics, and the development of science and technology. Luis obtained his PhD from Imperial College, University of London for work on critical phenomena in the early Universe, and associated mathematical techniques of Statistical Physics, Field Theory and Non-linear Dynamics. He held postdoctoral positions at the University of Heidelberg, Germany, as a Director’s Fellow in the Theoretical Division at LANL, and at the Center for Theoretical Physics at MIT. In 2000 he was awarded the distinguished Slansky Fellowship at Los Alamos National Laboratory for excellence in interdisciplinary research. He has been a scientist at LANL since the spring of 2003, first at the Computer and Computational Sciences Division (CCS), and since September 2005 in the Theoretical Division (T-5: Mathematical Modeling and Analysis). He is also External Professor at the Santa Fe Institute.<br />
<br><br />
<br><br />
<br />
----<br />
[[Image:Chaitin.jpg|200px|left]]<br />
'''Gregory Chaitin''', IBM Thomas J. Watson Research Center and Computer Science, University of Maine<br />
<br />
Greg is at the IBM Watson Research Center in New York. In the mid 1960s, when he was a teenager, he created algorithmic information theory (AIT), which combines, among other elements, Shannon's information theory and Turing's theory of computability. In the four decades since then he has been the principal architect of the theory. Among his contributions are the definition of a random sequence via algorithmic incompressibility, his information-theoretic approach to Gödel's incompleteness theorem, and the celebrated number <math>\Omega</math>. His work on Hilbert's 10th problem has shown that in a sense there is randomness in arithmetic, in other words, that God not only plays dice in quantum mechanics and nonlinear dynamics, but even in elementary number theory. His latest achievements have been to transform AIT into a theory about the size of real computer programs, programs that you can actually run, and his recent discovery that Leibniz anticipated AIT (1686). He is the author of nine books: Algorithmic Information Theory, published by Cambridge University Press; Information, Randomness & Incompleteness and Information-Theoretic Incompleteness, both published by World Scientific; The Limits of Mathematics, The Unknowable, Exploring Randomness and Conversations with a Mathematician, all published by Springer-Verlag; From Philosophy to Program Size, published by the Tallinn Institute of Cybernetics; and Meta Math!, published by Pantheon Books. In 1995 he was given the degree of doctor of science ''honoris causa'' by the University of Maine. In 2002 he was given the title of honorary professor by the University of Buenos Aires. In 2004 he was elected a corresponding member of the Académie Internationale de Philosophie des Sciences. He is also a visiting professor at the Computer Science Department of the University of Auckland, and on the international committee of the Valparaíso Complex Systems Institute.<br />
<br><br />
<br><br />
<br />
----<br />
[[Image:JPC_2007.jpg|200px|left]]<br />
'''James P. Crutchfield''', Professor of Physics and Director, Complexity Sciences Center, Physics Department, University of California at Davis<br />
<br />
Jim is Professor of Physics at the University of California,<br />
Davis, and Director of the Complexity Sciences Center&mdash;a new research and<br />
graduate program. Prior to this he was Research Professor at the Santa Fe<br />
Institute for many years, where he led its Dynamics of Learning Group<br />
and Network Dynamics Program. In parallel, he was Adjunct Professor<br />
of Physics in the Physics Department, University of New Mexico, Albuquerque.<br />
Before coming to SFI in 1997, he was a Research Physicist in the Physics<br />
Department at the University of California, Berkeley, since 1985. He received<br />
his B.A. summa cum laude in Physics and Mathematics from the University of<br />
California, Santa Cruz, in 1979 and his Ph.D. in Physics there in 1983.<br />
He has been a Visiting Research Professor at the Sloan Center for Theoretical<br />
Neurobiology, University of California, San Francisco; a Post-doctoral<br />
Fellow of the Miller Institute for Basic Research in Science at UCB; a<br />
UCB Physics Department IBM Post-Doctoral Fellow in Condensed Matter<br />
Physics; a Distinguished Visiting Research Professor of the Beckman<br />
Institute at the University of Illinois, Urbana-Champaign; and a Bernard<br />
Osher Fellow at the San Francisco Exploratorium. He is co-founder and Vice President of the<br />
Art and Science Laboratory in Santa Fe [[http://artscilab.com]].<br />
<br />
Over the last three decades Jim has worked in the areas of<br />
nonlinear dynamics, solid-state physics, astrophysics, fluid mechanics,<br />
critical phenomena and phase transitions, chaos, and pattern formation.<br />
His current research interests center on computational mechanics, the<br />
physics of complexity, statistical inference for nonlinear processes,<br />
genetic algorithms, evolutionary theory, machine learning, quantum dynamics,<br />
and distributed intelligence. He has published over 110 papers in these<br />
areas, including the following recent, related publications. Most are<br />
available from his website: [[http://cse.ucdavis.edu/~chaos]].<br />
<br />
<br />
<br /><br />
----<br />
[[Image:Debowski.jpg|200px|left]]<br />
'''Lukasz Debowski''', Research Scientist, Institute of Computer Science, Polish Academy of Sciences<br />
<br />
Lukasz's research interests revolve around probability, language, information, and learning.<br />
<br />
Lukasz works at IPI PAN, in the Statistical Analysis and Modeling group and partly in the Linguistic Engineering group. Seeking big intellectual adventures, he first studied at the Faculty of Physics, University of Warsaw. Later, he also visited UFAL, the Santa Fe Institute, CSE UNSW, and CWI. Many interesting people showed him strikingly different ideas about what is worth doing in the alpha and beta sciences, in engineering, and in general. "I slowly realize what I should and can do best myself."<br />
<br />
<br><br />
<br><br />
----<br />
[[Image:Ellison.jpg|200px|left]]<br />
'''Christopher Ellison''', Graduate Student, Complexity Sciences Center, Physics Department, University of California at Davis<br />
<br><br />
<br><br />
----<br />
[[Image:dpf.jpg|200px|left]]<br />
'''David Feldman''', Professor, Physics and Astronomy, College of the Atlantic; Co-Director, SFI Complex Systems Summer School, Beijing<br />
<br />
Dave's research training is in theoretical physics and mathematics, and his research interests lie in the fields of statistical mechanics and nonlinear dynamics. In particular, his research has examined how one might measure "complexity" or pattern in a mathematical system, and how such complexity is related to disorder. This work can be loosely categorized as belonging to the constellation of research topics often referred to as "chaos and complex systems." In his research, Dave uses both analytic and computational techniques. Dave has authored research papers in journals including Physical Review E, Chaos, Physics Letters A, and Advances in Complex Systems. <br />
<br />
As a graduate student at UC-Davis, Dave received several awards in recognition of both teaching and scholarship: The Dissertation Year Fellowship; The Chancellor's Teaching Fellowship; and he was nominated for the Outstanding Graduate Student Teaching Award. Dave joined the faculty at College of the Atlantic in 1998, where he teaches a wide range of physics and math courses. He also teaches classes that explore connections between science and politics, such as Making the Bomb (about the Manhattan project and atomic weapons), and Gender and Science. <br />
<br />
<br /><br />
----<br />
[[Image:Machta.jpg|200px|left]]<br />
'''Jon Machta''', Professor of Physics, University of Massachusetts at Amherst<br />
Jon's research is in the area of theoretical condensed matter and statistical physics. His current research involves theoretical and computational studies of spin systems and applications of computational complexity theory to statistical physics.<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Mahoney.jpg|200px|left]]<br />
'''John Mahoney''', Post-doctoral Researcher, School of Natural Sciences, University of California at Merced<br />
<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br />
----<br />
[[Image:mm2008.jpg|200px|left]]<br />
'''Melanie Mitchell''', Professor, Computer Science, Portland State University; External Professor and Science Board member, Santa Fe Institute<br />
<br />
Melanie Mitchell received a Ph.D. in Computer Science from the University of Michigan in 1990. Since then she has held faculty or professional positions at the University of Michigan, the Santa Fe Institute, Los Alamos National Laboratory, the OGI School of Science and Engineering, and Portland State University. <br />
<br />
Melanie has served as Director of the Santa Fe Institute’s Complex Systems Summer School; at Portland State University she teaches, among other courses, Exploring Complexity in Science and Technology.<br />
<br />
Her major work is in the areas of analogical reasoning, complex systems, genetic algorithms and cellular automata, and her publications in those fields are frequently cited. She is the author of ''An Introduction to Genetic Algorithms'', a widely known introductory book published by MIT Press in 1996. Her most recent book is ''Complexity: A Guided Tour'', named by Amazon.com as one of the 10 best science books of 2009.<br />
<br /><br />
<br />
----<br />
[[Image:Moore.jpg|200px|left]]<br />
'''Cris Moore''', Professor of Computer Science, University of New Mexico<br />
<br />
Cris is a Professor in the Computer Science Department at the University of New Mexico, with a joint appointment in the Department of Physics and Astronomy. He is also a Professor at the Santa Fe Institute. Cris studies interesting things like quantum computation (especially post-quantum cryptography and the possibility of algorithms for Graph Isomorphism), phase transitions in NP-complete problems (e.g. the colorability of random graphs, or the satisfiability of random formulas) and social networks (in particular, automated techniques for identifying important structural features of large networks).<br />
<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Shaw.jpg|200px|left]]<br />
'''Rob Shaw''', Research Scientist, ProtoLife, Inc., Venice, Italy<br />
<br><br />
<br><br />
"Dominos, Ergodic Flows": We present a model, developed with Norman Packard, of<br />
a simple discrete open flow system. Dimers are created at one edge<br />
of a two-dimensional lattice, diffuse across, and are removed at the opposite side.<br />
A steady-state flow is established, under various kinetic rules.<br />
In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem,<br />
whose entropy as a function of density is known. This entropy density is reproduced<br />
locally in the flow system, as shown by statistics over local templates. The goal is to<br />
clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Still.jpg|200px|left]]<br />
'''Susanne Still''', Assistant Professor of Computer Science, Department of Information and Computer Sciences, University of Hawaii at Manoa.<br />
<br />
Most research in learning theory deals with passive learning. However, many real world learning problems are interactive, and so is animal learning. <br />
<br />
The theoretical foundations for interactive learning and behavior are much less developed than those for passive learning. A theoretical understanding of behavioral learning lies at the heart of a new generation of machine intelligence, and is also at the core of many interesting questions about adaptation and learning in biology. <br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:VilelaMendes_2000.jpg|200px|left]]<br />
'''Rui Vilela-Mendes''', Professor of Mathematics, Instituto Superior Tecnico, Lisboa, Portugal.<br />
Rui received an Electrical Engineering degree from the Technical University (IST), Lisbon, a Ph.D. in Physics from the University of Texas (Austin) and a Habilitation in Mathematics from the University of Lisbon. He is currently a member of the Center for Mathematics and Applications (CMAF-UL) and of the Institute for Plasmas and Nuclear Fusion (IPFN-IST), as well as a member of the Lisbon Academy of Sciences. He was a visiting researcher at CERN, CNRS (Marseille), IHES (Bures), and the University of Bielefeld, and co-organizer and collaborator of several international research projects on Theoretical Physics and the Sciences of Complexity.<br />
<br />
Over the last few decades Rui has worked in the areas of mathematical economics, nonlinear dynamics and control, stochastic processes and quantum theory. His current research interests center on mathematical economics, the physics of complexity, control and quantum computing.<br />
<br><br />
<br />
----<br />
[[Image:Wiesner.png|200px|left]]<br />
'''Karoline Wiesner''', Assistant Professor, School of Mathematics and Centre for Complexity Sciences, University of Bristol<br />
<br />
Bateson defines information as “a difference that makes a difference”. Complexity is “when quantitative differences become qualitative differences.” We need information theory to identify this difference. Key to my work is coming up with good measures of complexity for classical (biological) and quantum systems. The goal is to build a tool set for identifying and measuring structure. Part of this tool set is a hierarchy of classical and quantum computational architectures. How difficult it is to generate a given structure determines how high up in this architectural hierarchy its representation is found.<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br />
----<br />
<br />
== Observers ==<br />
<br />
[[Image:Trabesinger.jpg|200px|left]]<br />
'''Andreas Trabesinger''', Senior Editor, Nature Physics<br />
<br />
Throughout his doctoral and post-doctoral studies, Andreas focused on various aspects of nuclear magnetic resonance, including applications to monitoring brain metabolism and NMR at very low magnetic fields. After graduating from the physics department of ETH-Zürich in 2000, he conducted research at the Institute of Biomedical Engineering and in the Laboratory of Physical Chemistry at ETH, as well as at the Department of Chemistry at Berkeley, where he collaborated with the condensed-matter and atomic physics groups.<br />
<br><br />
<br></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:Bettencourt.jpg&diff=39007File:Bettencourt.jpg2011-01-07T01:05:13Z<p>Chaos: uploaded a new version of "File:Bettencourt.jpg"</p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=File:Agenda.pdf&diff=39006File:Agenda.pdf2011-01-07T01:03:52Z<p>Chaos: uploaded a new version of "File:Agenda.pdf":&#32;Agenda</p>
<hr />
<div></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Bios&diff=39005Randomness, Structure and Causality - Bios2011-01-07T01:03:29Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br><br />
<br><br />
----<br />
[[Image:Ay.jpg|200px|left]]<br />
'''Nihat Ay''', Associate Professor of Mathematics, University of Leipzig<br />
<br />
Nihat studied mathematics and physics at the Ruhr University Bochum and received his Ph.D. in mathematics from the University of Leipzig in 2001. In 2003 and 2004 he was a postdoctoral fellow at the Santa Fe Institute and at the Redwood Neuroscience Institute (now the Redwood Center for Theoretical Neuroscience at UC Berkeley). After his postdoctoral stay in the USA he became a member of the Mathematical Institute of the Friedrich Alexander University in Erlangen at the assistant professor level. Since September 2005 he has worked as a Max Planck Research Group Leader at the Max Planck Institute for Mathematics in the Sciences in Leipzig, where he heads the group Information Theory of Cognitive Systems. As an external professor of the Santa Fe Institute he was involved in research on complexity and robustness theory. Since September 2009 he has been affiliated with the University of Leipzig as associate professor (Privatdozent) for mathematics.<br />
<br><br />
<br><br />
----<br />
[[Image:Bell.jpg|200px|left]]<br />
'''Anthony Bell''', Research Scientist, Redwood Center for Theoretical Neuroscience, UC Berkeley<br />
<br />
Tony's long-term scientific goal is to work out how the brain learns (self-organises). This has taken him in directions of Information Theory and probability theory for neural networks. This provides a hopelessly crude and impoverished model (called redundancy reduction) of what the brain does and how it lives in its world. Unfortunately, it's the best we have at the moment. We have to do some new mathematics before we reach self-organisational principles that will apply to the physical substrate of the brain, which is molecular: ion channels, enzyme complexes, gene expression networks. We have to think about dynamics, loops, open systems, how open dynamical systems can encode and effect the spatio-temporal trajectories of their perturbing inputs.<br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Bettencourt.jpg|200px|left]]<br />
'''Luis Bettencourt''', Research Professor, Santa Fe Institute; Staff Researcher, Theoretical Division, LANL<br />
<br />
Luís M. A. Bettencourt carries out research on the structure and dynamics of several complex systems, with an emphasis on dynamical problems in biology and society. Currently <br />
he works on information processing in neural systems, information theoretic optimization in collective behavior, urban organization and dynamics, and the development of science and technology. Luis obtained his PhD from Imperial College, University of London for work on critical phenomena in the early Universe, and associated mathematical techniques of Statistical Physics, Field Theory and Non-linear Dynamics. He held postdoctoral positions at the University of Heidelberg, Germany, as a Director’s Fellow in the Theoretical Division at LANL, and at the Center for Theoretical Physics at MIT. In 2000 he was awarded the distinguished Slansky Fellowship at Los Alamos National Laboratory for excellence in interdisciplinary research. He has been a scientist at LANL since the spring of 2003, first at the Computer and Computational Sciences Division (CCS), and since September 2005 in the Theoretical Division (T-5: Mathematical Modeling and Analysis). He is also External Professor at the Santa Fe Institute.<br />
<br><br />
<br><br />
<br />
----<br />
[[Image:Chaitin.jpg|200px|left]]<br />
'''Gregory Chaitin''', IBM Thomas J. Watson Research Center and Computer Science, University of Maine<br />
<br />
Greg is at the IBM Watson Research Center in New York. In the mid 1960s, when he was a teenager, he created algorithmic information theory (AIT), which combines, among other elements, Shannon's information theory and Turing's theory of computability. In the four decades since then he has been the principal architect of the theory. Among his contributions are the definition of a random sequence via algorithmic incompressibility, his information-theoretic approach to Gödel's incompleteness theorem, and the celebrated number <math>\Omega</math>. His work on Hilbert's 10th problem has shown that in a sense there is randomness in arithmetic, in other words, that God not only plays dice in quantum mechanics and nonlinear dynamics, but even in elementary number theory. His latest achievements have been to transform AIT into a theory about the size of real computer programs, programs that you can actually run, and his recent discovery that Leibniz anticipated AIT (1686). He is the author of nine books: Algorithmic Information Theory, published by Cambridge University Press; Information, Randomness & Incompleteness and Information-Theoretic Incompleteness, both published by World Scientific; The Limits of Mathematics, The Unknowable, Exploring Randomness and Conversations with a Mathematician, all published by Springer-Verlag; From Philosophy to Program Size, published by the Tallinn Institute of Cybernetics; and Meta Math!, published by Pantheon Books. In 1995 he was given the degree of doctor of science ''honoris causa'' by the University of Maine. In 2002 he was given the title of honorary professor by the University of Buenos Aires. In 2004 he was elected a corresponding member of the Académie Internationale de Philosophie des Sciences. He is also a visiting professor at the Computer Science Department of the University of Auckland, and on the international committee of the Valparaíso Complex Systems Institute.<br />
<br><br />
<br><br />
<br />
----<br />
[[Image:JPC_2007.jpg|200px|left]]<br />
'''James P. Crutchfield''', Professor of Physics and Director, Complexity Sciences Center, Physics Department, University of California at Davis<br />
<br />
Jim is Professor of Physics at the University of California,<br />
Davis, and Director of the Complexity Sciences Center&mdash;a new research and<br />
graduate program. Prior to this he was Research Professor at the Santa Fe<br />
Institute for many years, where he led its Dynamics of Learning Group<br />
and Network Dynamics Program. In parallel, he was Adjunct Professor<br />
of Physics in the Physics Department, University of New Mexico, Albuquerque.<br />
Before coming to SFI in 1997, he was a Research Physicist in the Physics<br />
Department at the University of California, Berkeley, since 1985. He received<br />
his B.A. summa cum laude in Physics and Mathematics from the University of<br />
California, Santa Cruz, in 1979 and his Ph.D. in Physics there in 1983.<br />
He has been a Visiting Research Professor at the Sloan Center for Theoretical<br />
Neurobiology, University of California, San Francisco; a Post-doctoral<br />
Fellow of the Miller Institute for Basic Research in Science at UCB; a<br />
UCB Physics Department IBM Post-Doctoral Fellow in Condensed Matter<br />
Physics; a Distinguished Visiting Research Professor of the Beckman<br />
Institute at the University of Illinois, Urbana-Champaign; and a Bernard<br />
Osher Fellow at the San Francisco Exploratorium. He is co-founder and Vice President of the<br />
Art and Science Laboratory in Santa Fe [[http://artscilab.com]].<br />
<br />
Over the last three decades Jim has worked in the areas of<br />
nonlinear dynamics, solid-state physics, astrophysics, fluid mechanics,<br />
critical phenomena and phase transitions, chaos, and pattern formation.<br />
His current research interests center on computational mechanics, the<br />
physics of complexity, statistical inference for nonlinear processes,<br />
genetic algorithms, evolutionary theory, machine learning, quantum dynamics,<br />
and distributed intelligence. He has published over 110 papers in these<br />
areas, including the following recent, related publications. Most are<br />
available from his website: [[http://cse.ucdavis.edu/~chaos]].<br />
<br />
<br />
<br /><br />
----<br />
[[Image:Debowski.jpg|200px|left]]<br />
'''Lukasz Debowski''', Research Scientist, Institute of Computer Science, Polish Academy of Sciences<br />
<br />
Lukasz's research interests revolve around probability, language, information, and learning.<br />
<br />
Lukasz works at the IPI PAN, with the Statistical Analysis and Modeling group and partly with the Linguistic Engineering group. Seeking big intellectual adventures, he first studied at the Faculty of Physics, University of Warsaw. Later, he also visited the UFAL, the Santa Fe Institute, the CSE UNSW, and the CWI. Many interesting people showed him strikingly different ideas about what is worth doing in alpha and beta sciences, in engineering, and in general. "I slowly realize what I should and can do best myself."<br />
<br />
<br><br />
<br><br />
----<br />
[[Image:Ellison.jpg|200px|left]]<br />
'''Christopher Ellison''', Graduate Student, Complexity Sciences Center, Physics Department, University of California at Davis<br />
<br><br />
<br><br />
----<br />
[[Image:dpf.jpg|200px|left]]<br />
'''David Feldman''', Professor, Physics and Astronomy, College of the Atlantic; Co-Director, SFI Complex Systems Summer School, Beijing<br />
<br />
Dave's research training is in theoretical physics and mathematics, and his research interests lie in the fields of statistical mechanics and nonlinear dynamics. In particular, his research has examined how one might measure "complexity" or pattern in a mathematical system, and how such complexity is related to disorder. This work can be loosely categorized as belonging to the constellation of research topics often referred to as "chaos and complex systems." In his research, Dave uses both analytic and computational techniques. Dave has authored research papers in journals including Physical Review E, Chaos, Physics Letters A, and Advances in Complex Systems. <br />
<br />
As a graduate student at UC-Davis, Dave received several awards in recognition of both teaching and scholarship: The Dissertation Year Fellowship; The Chancellor's Teaching Fellowship; and he was nominated for the Outstanding Graduate Student Teaching Award. Dave joined the faculty at College of the Atlantic in 1998, where he teaches a wide range of physics and math courses. He also teaches classes that explore connections between science and politics, such as Making the Bomb (about the Manhattan project and atomic weapons), and Gender and Science. <br />
<br />
<br /><br />
----<br />
[[Image:Machta.jpg|200px|left]]<br />
'''Jon Machta''', Professor of Physics, University of Massachusetts at Amherst<br />
Jon's research is in the area of theoretical condensed matter and statistical physics. His current research involves theoretical and computational studies of spin systems and applications of computational complexity theory to statistical physics.<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Mahoney.jpg|200px|left]]<br />
'''John Mahoney''', Post-doctoral Researcher, School of Natural Sciences, University of California at Merced<br />
<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br />
----<br />
[[Image:mm2008.jpg|200px|left]]<br />
'''Melanie Mitchell''', Professor, Computer Science, Portland State University; External Professor and Science Board member, Santa Fe Institute<br />
<br />
Melanie Mitchell received a Ph.D. in Computer Science from the University of Michigan in 1990. Since then she has held faculty or professional positions at the University of Michigan, the Santa Fe Institute, Los Alamos National Laboratory, the OGI School of Science and Engineering, and Portland State University. <br />
<br />
Melanie has served as Director of the Santa Fe Institute’s Complex Systems Summer School; at Portland State University she teaches, among other courses, Exploring Complexity in Science and Technology.<br />
<br />
Her major work is in the areas of analogical reasoning, complex systems, genetic algorithms, and cellular automata, and her publications in those fields are frequently cited. She is the author of ''An Introduction to Genetic Algorithms'', a widely known introductory book published by MIT Press in 1996. Her most recent book is ''Complexity: A Guided Tour'', named by Amazon.com as one of the 10 best science books of 2009.<br />
<br /><br />
<br />
----<br />
[[Image:Moore.jpg|200px|left]]<br />
'''Cris Moore''', Professor of Computer Science, University of New Mexico<br />
<br />
Cris is a Professor in the Computer Science Department at the University of New Mexico, with a joint appointment in the Department of Physics and Astronomy. He is also a Professor at the Santa Fe Institute. Cris studies interesting things like quantum computation (especially post-quantum cryptography and the possibility of algorithms for Graph Isomorphism), phase transitions in NP-complete problems (e.g. the colorability of random graphs, or the satisfiability of random formulas) and social networks (in particular, automated techniques for identifying important structural features of large networks).<br />
<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Shaw.jpg|200px|left]]<br />
'''Rob Shaw''', Research Scientist, ProtoLife, Inc., Venice, Italy<br />
<br><br />
<br><br />
"Dominos, Ergodic Flows": We present a model, developed with Norman Packard, of<br />
a simple discrete open flow system. Dimers are created at one edge<br />
of a two-dimensional lattice, diffuse across, and are removed at the opposite side.<br />
A steady-state flow is established, under various kinetic rules.<br />
In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem,<br />
whose entropy as a function of density is known. This entropy density is reproduced<br />
locally in the flow system, as shown by statistics over local templates. The goal is to<br />
clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:Still.jpg|200px|left]]<br />
'''Susanne Still''', Assistant Professor of Computer Science, Department of Information and Computer Sciences, University of Hawaii at Manoa.<br />
<br />
Most research in learning theory deals with passive learning. However, many real-world learning problems are interactive, and so is animal learning. <br />
<br />
The theoretical foundations for interactive learning and behavior are much less developed than those for passive learning. A theoretical understanding of behavioral learning lies at the heart of a new generation of machine intelligence, and is also at the core of many interesting questions about adaptation and learning in biology. <br />
<br><br />
<br><br />
<br><br />
<br><br />
----<br />
[[Image:VilelaMendes_2000.jpg|200px|left]]<br />
'''Rui Vilela-Mendes''', Professor of Mathematics, Instituto Superior Tecnico, Lisboa, Portugal.<br />
Rui received an Electrical Engineering degree from the Technical University (IST), Lisbon, a Ph.D. in Physics from the University of Texas at Austin, and a Habilitation in Mathematics from the University of Lisbon. He is currently a member of the Center for Mathematics and Applications (CMAF-UL) and of the Institute for Plasmas and Nuclear Fusion (IPFN-IST), as well as a member of the Lisbon Academy of Sciences. He was a visiting researcher at CERN, CNRS (Marseille), IHES (Bures), and the University of Bielefeld, and a co-organizer of and collaborator on several international research projects on theoretical physics and the sciences of complexity.<br />
<br />
Over the last few decades Rui has worked in the areas of mathematical economics, nonlinear dynamics and control, stochastic processes, and quantum theory. His current research interests center on mathematical economics, the physics of complexity, control, and quantum computing.<br />
<br><br />
<br />
----<br />
[[Image:Wiesner.png|200px|left]]<br />
'''Karoline Wiesner''', Assistant Professor, School of Mathematics and Centre for Complexity Sciences, University of Bristol<br />
<br />
Bateson defines information as “a difference that makes a difference”. Complexity is “when quantitative differences become qualitative differences.” We need information theory to identify this difference. Key to my work is coming up with good measures of complexity for classical (biological) and quantum systems. The goal is to build a tool set for identifying and measuring structure. Part of this tool set is a hierarchy of classical and quantum computational architectures. How difficult it is to generate a given structure determines how high up in this architectural hierarchy its representation is found.<br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br><br />
<br />
----<br />
<br />
== Observers ==<br />
<br />
[[Image:Trabesinger.jpg|200px|left]]<br />
'''Andreas Trabesinger''', Senior Editor, Nature Physics<br />
<br />
Throughout his doctoral and post-doctoral studies, Andreas focused on various aspects of nuclear magnetic resonance, including applications to monitoring brain metabolism and NMR at very low magnetic fields. After graduating from the physics department of ETH Zürich in 2000, he conducted research at the Institute of Biomedical Engineering and in the Laboratory of Physical Chemistry at ETH, as well as at the Department of Chemistry at Berkeley, where he collaborated with the condensed-matter and atomic physics groups.<br />
<br><br />
<br></div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=39004Randomness, Structure and Causality - Abstracts2011-01-07T01:02:23Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''Effective Complexity of Stationary Process Realizations'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
The concept of effective complexity of an object as the minimal description length of its regularities was introduced by Gell-Mann and Lloyd. Building on their work, we gave a precise definition of the effective complexity of finite binary strings in terms of algorithmic information theory in a previous paper. Here we study the effective complexity of strings generated by stationary processes. Sufficiently long typical process realizations turn out to be effectively simple under any linear scaling with the string's length of the parameter <math>\Delta</math>, which determines the minimization domain. For a class of computable ergodic processes, including i.i.d. and ergodic Markovian processes, a stronger result can be shown: there exist sublinear scalings of <math>\Delta</math> for which typical realizations turn out to be effectively simple. Our results become most transparent in the context of coarse effective complexity, a modification of plain effective complexity in which <math>\Delta</math> appears as a minimization argument. A similar modification of the closely related concept of sophistication was introduced by Antunes and Fortnow as coarse sophistication.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term, and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation - expressed in information theoretic terms - provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue and more generally how it may have implications for other network structures in social and biological systems.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
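The synergy point can be made concrete with a standard toy example of ours (not taken from the talk): for Y = X1 XOR X2 with uniform independent inputs, each source alone carries zero information about Y, yet jointly they carry a full bit, so the aggregate exceeds the sum of its parts.

```python
import math
from collections import Counter

def mutual_info(pairs):
    """I(A;B) in bits, estimated from a list of (a, b) samples."""
    n = len(pairs)
    pa = Counter(a for a, _ in pairs)       # marginal counts of A
    pb = Counter(b for _, b in pairs)       # marginal counts of B
    pab = Counter(pairs)                    # joint counts of (A, B)
    return sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

# Y = X1 XOR X2 over uniform inputs: each source alone tells us nothing
# about Y, but together they determine it completely (pure synergy).
samples = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]
i_single = mutual_info([(x1, y) for x1, _, y in samples])
i_joint = mutual_info([((x1, x2), y) for x1, x2, y in samples])
print(i_single, i_joint)   # 0.0 bits for one source, 1.0 bit jointly
```
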
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
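As a toy illustration of the setup the abstract describes, here is a minimal hill-climbing sketch in which bit strings stand in for programs and the fitness function is arbitrary; this is our own simplification, not Chaitin's actual construction.

```python
import random

def hill_climb(fitness, genome_len=32, steps=2000, seed=0):
    """Random walk of increasing fitness: flip one random bit and keep
    the mutant only if it is strictly fitter than the current organism."""
    rng = random.Random(seed)
    organism = [rng.randint(0, 1) for _ in range(genome_len)]
    best = fitness(organism)
    history = [best]                             # fitness trace of the walk
    for _ in range(steps):
        mutant = organism[:]
        mutant[rng.randrange(genome_len)] ^= 1   # random point mutation
        f = fitness(mutant)
        if f > best:                             # accept improvements only
            organism, best = mutant, f
        history.append(best)
    return organism, history

# Toy fitness landscape: the number of 1-bits in the genome.
org, hist = hill_climb(fitness=sum)
print(hist[0], hist[-1])   # fitness never decreases along the walk
```

The rate at which `history` rises is the sketch's analogue of the "rate of biological creativity" studied in the talk.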
----<br />
'''Framing Complexity'''<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<p><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
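As a hands-on companion to the abstract, the excess entropy can be estimated from data through block entropies, using the finite-L approximation E(L) = H(L) - L*h(L) with h(L) = H(L) - H(L-1). The sketch below is an illustrative estimator of ours, not the causal-state calculation developed in the paper.

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (in bits) of length-L blocks, from empirical counts."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def excess_entropy_estimate(seq, L):
    """Finite-L estimate E(L) = H(L) - L*h(L), with the entropy rate
    approximated by the block-entropy increment h(L) = H(L) - H(L-1)."""
    HL = block_entropy(seq, L)
    h = HL - block_entropy(seq, L - 1)
    return HL - L * h

# Period-2 process ...010101...: the entropy rate is 0, but one bit of
# phase information is stored in the present, so E should be near 1 bit.
seq = [0, 1] * 500
E = excess_entropy_estimate(seq, L=4)
print(E)
```
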
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems is a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br />
<br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP, and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, as well as several topics that are current hotbeds of interdisciplinary research, such as phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
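The phase transitions in computation mentioned here can be seen in a small brute-force experiment: random 3-SAT formulas flip from almost surely satisfiable to almost surely unsatisfiable as the clause-to-variable ratio crosses a threshold (empirically near 4.27). The sketch below is our illustration, not code from the talk.

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-SAT: each clause has 3 distinct variables, each negated
    with probability 1/2 (literal +v means v true, -v means v false)."""
    return [tuple(v if rng.random() < 0.5 else -v
                  for v in rng.sample(range(1, n_vars + 1), 3))
            for _ in range(n_clauses)]

def satisfiable(formula, n_vars):
    """Brute-force search over all 2^n assignments (fine for tiny n)."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in formula):
            return True
    return False

rng = random.Random(1)
n, trials = 10, 20
results = {}
for alpha in (2.0, 6.0):   # ratios well below / well above the threshold
    sat = sum(satisfiable(random_3sat(n, int(alpha * n), rng), n)
              for _ in range(trials))
    results[alpha] = sat
    print(f"alpha={alpha}: {sat}/{trials} satisfiable")
```
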
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite side. A steady-state flow is established, under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify informational aspects of a flowing pattern.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Susanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Renyi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: Clustering and <br />
synchronization, self-organized criticality and the topological structure of <br />
networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
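For concreteness, the Lyapunov exponent, the baseline ergodic parameter that the abstract generalizes, can be estimated as the orbit average of log |f'(x)|. A quick sketch of ours for the logistic map:

```python
import math

def lyapunov_logistic(r, x0=0.3, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as the ergodic
    (orbit) average of log |f'(x)| = log |r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):            # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

l_chaotic = lyapunov_logistic(4.0)   # chaotic regime: known value is ln 2
l_periodic = lyapunov_logistic(3.2)  # negative: orbit settles on a 2-cycle
print(l_chaotic, l_periodic)
```
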
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Chaoshttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Participants&diff=39003Randomness, Structure and Causality - Participants2011-01-07T01:01:05Z<p>Chaos: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<table border="1"><br />
<tr><br />
<th>Name</th><br />
<th>Email</th><br />
<th>Institution</th><br />
<th>Talk</th><br />
<th>Paper</th><br />
</tr><br />
<tr><br />
<td>*Ay, Nihat</td><br />
<td>nay@mis.mpg.de</td> <br />
<td>SFI & Max Planck Institute</td><br />
<td>Effective Complexity of Stationary Process Realizations</td><br />
<td>[[http://arxiv.org/abs/1001.2686]]</td><br />
</tr><br />
<tr><br />
<td>*Bell, Tony</td><br />
<td>tony@salk.edu</td><br />
<td>UC Berkeley</td><br />
<td>Learning Out of Equilibrium</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Bettencourt, Luis</td><br />
<td>lmbettencourt@gmail.com</td><br />
<td>SFI & LANL</td><br />
<td>Information Aggregation in Correlated Complex Systems and Optimal Estimation</td><br />
<td>[[http://arxiv.org/abs/0712.2218]]</td><br />
</tr><br />
<tr><br />
<td>*Chaitin, Gregory</td><br />
<td>gjchaitin@gmail.com</td><br />
<td>IBM Watson Research Center</td><br />
<td>To a Mathematical Theory of Evolution and Biological Creativity</td><br />
<td>[[Media:Darwin.pdf| Paper]]</td><br />
</tr><br />
<tr> <br />
<td>*Crutchfield, James</td><br />
<td>chaos@cse.ucdavis.edu</td><br />
<td>SFI & UC Davis</td><br />
<td>Framing Complexity</td><br />
<td>[[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]</td><br />
</tr><br />
<tr><br />
<td>*Debowski, Lukasz</td><br />
<td>ldebowsk@ipipan.waw.pl</td><br />
<td>Polish Academy of Sciences</td><br />
<td>The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts</td><br />
<td>[[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]</td><br />
</tr><br />
<tr><br />
<td>*Ellison, Christopher</td><br />
<td>cellison@cse.ucdavis.edu</td><br />
<td>UC Davis</td><br />
<td>Prediction, Retrodiction, and the Amount of Information Stored in the Present</td><br />
<td>[[http://arxiv.org/abs/0905.3587]]</td><br />
</tr><br />
<tr><br />
<td>*Feldman, David</td><br />
<td>dave@hornacek.coa.edu</td><br />
<td>College of the Atlantic</td><br />
<td>Complexity Measures and Frustration</td><br />
<td>[[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]] </td><br />
</tr><br />
<tr><br />
<td>*Mahoney, John</td><br />
<td>jmahoney3@ucmerced.edu</td><br />
<td>UC Merced</td><br />
<td>Crypticity and Information Accessibility</td><br />
<td>[[http://arxiv.org/abs/0905.4787]]</td><br />
</tr><br />
<tr><br />
<td>*Machta, Jon</td><br />
<td>machta@physics.umass.edu</td><br />
<td>SFI & University of Massachusetts</td><br />
<td>Complexity, Parallel Computation and Statistical Physics</td><br />
<td>[[http://arxiv.org/abs/cond-mat/0510809]]</td><br />
</tr><br />
<tr><br />
<td>*Mitchell, Melanie</td><br />
<td>mm@cs.pdx.edu</td><br />
<td>SFI & Portland State University</td><br />
<td>Automatic Identification of Information-Processing Structures in Cellular Automata</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Moore, Cris</td><br />
<td>moore@cs.unm.edu</td><br />
<td>SFI & University of New Mexico</td><br />
<td>Phase Transitions and Computational Complexity</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Shaw, Robert Stetson</td><br />
<td>rob@protolife.net</td><br />
<td>ProtoLife, Inc.</td><br />
<td>Dominos, Ergodic Flows</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Still, Susanne</td><br />
<td>sstill@hawaii.edu</td><br />
<td>University of Hawaii at Manoa</td><br />
<td>Statistical Mechanics of Interactive Learning</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Vilela-Mendes, Rui</td><br />
<td>vilela@cii.fc.ul.pt</td><br />
<td>University of Lisbon</td><br />
<td>Ergodic Parameters and Dynamical Complexity</td><br />
<td>[[http://arxiv.org/abs/1008.2664]]</td><br />
</tr><br />
<tr><br />
<td>Trabesinger, Andreas</td><br />
<td>a.trabesinger@nature.com</td><br />
<td>Nature Physics Magazine</td><br />
<td>Measuring Complexity?</td><br />
<td>[[http://www.nature.com/nphys/]]</td><br />
</tr><br />
<tr><br />
<td>*Wiesner, Karoline</td><br />
<td>k.wiesner@bristol.ac.uk</td><br />
<td>University of Bristol</td><br />
<td>Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States</td><br />
<td>[[http://arxiv.org/abs/1002.2337]]</td><br />
</tr><br />
</table><br />
<nowiki>*Confirmed</nowiki></div>Chaos