Module:Complexity

From Santa Fe Institute Events Wiki

Complex Systems Summer School 2011 Modules

Organized by Jim Crutchfield

Background

  • T. Cover and J. Thomas, Elements of Information Theory, Wiley, Second Edition (2006), Chapters 1-7.
  • M. Li and P. M. B. Vitanyi, An Introduction to Kolmogorov Complexity and Its Applications, Springer, New York (1993).
  • J. P. Crutchfield and D. P. Feldman, “Regularities Unseen, Randomness Observed: Levels of Entropy Convergence”, CHAOS 13:1 (2003) 25-54 (a short estimation sketch follows this list).
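
A minimal Python sketch (not part of the course materials) of the block-entropy quantities defined in the Crutchfield-Feldman paper above: the empirical block entropy H(L) and the finite-L entropy-rate estimate h_mu(L) = H(L) - H(L-1). The golden-mean example process, the sample length, and all function names are illustrative choices, not taken from the wiki or from CMPy.

import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Empirical Shannon entropy (in bits) of the length-L blocks of seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def golden_mean_sample(n, p=0.5, rng=random):
    """Sample the golden-mean process: after a 1 the next symbol must be 0."""
    out, prev = [], 0
    for _ in range(n):
        s = 0 if prev == 1 else (1 if rng.random() < p else 0)
        out.append(s)
        prev = s
    return out

if __name__ == "__main__":
    seq = golden_mean_sample(100_000)
    prev_H = 0.0
    for L in range(1, 9):
        H = block_entropy(seq, L)
        # h_mu(L) = H(L) - H(L-1) converges to the entropy rate as L grows;
        # for the golden-mean process it approaches 2/3 bit per symbol.
        print(f"L={L}  H(L)={H:.4f}  h_mu(L)={H - prev_H:.4f}")
        prev_H = H

Plotting H(L) against L and watching h_mu(L) level off is the entropy-convergence picture the paper develops; the excess entropy is the total amount by which the finite-L estimates overshoot the limiting rate.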

Lecture Notes

Ryan's talks

  1. Anatomy of an Observation
  2. Length Scales in Complex Time Series

Readings

The Calculi of Emergence

An Information-Theoretic Primer on Complexity, Self-Organisation and Emergence

Pattern Discovery and Computational Mechanics

Many Roads to Synchrony: Natural Time Scales and Their Algorithms

Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

Information Accessibility and Cryptic Processes

Prediction, Retrodiction, and The Amount of Information Stored in the Present

The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing

Anatomy of a Bit

Information modification and particle collisions in distributed computation

Reconstruction of Epsilon-Machines in Predictive Frameworks and Decisional States

A measure of statistical complexity based on predictive information (http://arxiv.org/abs/1012.1890)

Excess entropy in natural language: present state and perspectives (http://arxiv.org/abs/1105.1306)

Online Labs

CMPy Notebook (http://172.29.16.101:8000)