Randomness, Structure and Causality - Agenda
From Santa Fe Institute Events Wiki
Abstracts
Effective Complexity of Stationary Process Realizations
Ay, Nihat (nay@mis.mpg.de)
SFI & Max Planck Institute
Links: [[1]]
Learning Out of Equilibrium
Bell, Tony (tony@salk.edu)
UC Berkeley
Links:
The Transmission Sense of Information
Bergstrom, Carl (cbergst@u.washington.edu)
SFI & University of Washington
Links: [[2]]
Optimizing Information Flow in Small Genetic Networks
Bialek, William (wbialek@princeton.edu)
Princeton University
Links: [[3]]
To a Mathematical Theory of Evolution and Biological Creativity
Chaitin, Gregory (gjchaitin@gmail.com)
IBM Watson Research Center
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.
Links: [[File:Darwin.pdf]]
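To make the setup concrete, here is a minimal hill-climbing sketch in Python. It is a toy stand-in, not Chaitin's construction: his organisms are computer programs whose fitness is tied to naming large integers, whereas here the organism is a plain bit string and the fitness function is an arbitrary placeholder.

 import random
 
 def fitness(organism):
     # Placeholder fitness: the number of 1-bits. Chaitin's model instead
     # scores a program by the size of the integer it computes.
     return sum(organism)
 
 def mutate(organism):
     # A random point mutation: flip one randomly chosen bit.
     offspring = organism[:]
     i = random.randrange(len(offspring))
     offspring[i] ^= 1
     return offspring
 
 def evolve(length=64, steps=10000):
     # The random walk of increasing fitness made by a single mutating
     # organism: a mutation is kept only if it strictly improves fitness.
     organism = [random.randint(0, 1) for _ in range(length)]
     history = [fitness(organism)]
     for _ in range(steps):
         candidate = mutate(organism)
         if fitness(candidate) > fitness(organism):
             organism = candidate
         history.append(fitness(organism))
     return history
 
 # The growth of the fitness trace plays the role of the rate of
 # biological creativity in the two models analyzed in the talk.
 print(evolve()[-1])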
Framing Complexity
Crutchfield, James (chaos@cse.ucdavis.edu)
SFI & UC Davis
Is there a theory of complex systems? And who should care, anyway?
Links: [[4]]
The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)
Polish Academy of Sciences
We will present a new explanation for the distribution of words in natural language which is grounded in information theory and inspired by recent research on excess entropy. Namely, we will demonstrate a theorem with the following informal statement: If a text of length <math>n</math> describes <math>n^\beta</math> independent facts in a repetitive way, then the text contains at least <math>n^\beta/\log n</math> different words. In the formal statement, two modeling postulates are adopted. Firstly, the words are understood as nonterminal symbols of the shortest grammar-based encoding of the text. Secondly, the text is assumed to be emitted by a finite-energy strongly nonergodic source, whereas the facts are binary IID variables predictable in a shift-invariant way. Besides the theorem, we will exhibit a few stochastic processes to which this and similar statements can be related.
Links: [[5]] and [[6]]
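Stated compactly (a hedged reading, following the journal paper of the same title, with <math>V(n)</math> denoting the number of distinct words in a text of length <math>n</math> and <math>0<\beta<1</math>): if the text repetitively describes <math>n^\beta</math> independent facts, then <math>V(n) \ge c\, n^\beta/\log n</math> for some constant <math>c>0</math>.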
Prediction, Retrodiction, and the Amount of Information Stored in the Present
Ellison, Christopher (cellison@cse.ucdavis.edu)
Complexity Sciences Center, UC Davis
We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy (a familiar measure of organization in complex systems) is the mutual information not only between the past and the future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these connections lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
Links: [[7]]
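In the notation of the computational mechanics literature, the invariants named above can be summarized as follows (a sketch, assuming the conventions of the Crutchfield-Ellison-Mahoney papers: <math>\overleftarrow{X}</math> the past, <math>\overrightarrow{X}</math> the future, <math>\mathcal{S}^+</math> and <math>\mathcal{S}^-</math> the predictive and retrodictive causal states, <math>C_\mu^\pm</math> their entropies):

 <math>\mathbf{E} = I[\overleftarrow{X};\overrightarrow{X}] = I[\mathcal{S}^+;\mathcal{S}^-]</math> (excess entropy),
 <math>\chi^+ = C_\mu^+ - \mathbf{E}</math> (crypticity, the stored information not visible in observations),
 <math>\Xi = C_\mu^+ - C_\mu^-</math> (causal irreversibility).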
Spatial Information Theory
Feldman, David (dave@hornacek.coa.edu)
College of the Atlantic
Links: [[File:Afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]]
Crypticity and Information Accessibility
Mahoney, John (jmahoney3@ucmerced.edu)
UC Merced
Links: [[8]]
Complexity, Parallel Computation and Statistical Physics
Machta, Jon (machta@physics.umass.edu)
SFI & University of Massachusetts
Links: [[9]]
Automatic Identification of Information-Processing Structures in Cellular Automata
Mitchell, Melanie (mm@cs.pdx.edu)
SFI & Portland State University
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.
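As a concrete illustration of the kind of information-theoretic filter reviewed in the talk, here is a hedged Python sketch in the spirit of Lizier et al.'s local active information storage. It is a simplified reimplementation for illustration, not their reference code, and it uses elementary CA rule 110 rather than an evolved density classifier; swapping in another rule is straightforward.

 import numpy as np
 
 def eca_step(row, rule):
     # One synchronous update of an elementary CA with periodic boundaries.
     left, right = np.roll(row, 1), np.roll(row, -1)
     neighborhood = 4 * left + 2 * row + right      # Wolfram encoding, 0..7
     table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
     return table[neighborhood]
 
 def spacetime(width=200, steps=400, rule=110, seed=0):
     # Spacetime field: rows are time steps, columns are cells.
     rng = np.random.default_rng(seed)
     field = np.empty((steps, width), dtype=np.uint8)
     field[0] = rng.integers(0, 2, width, dtype=np.uint8)
     for t in range(1, steps):
         field[t] = eca_step(field[t - 1], rule)
     return field
 
 def local_active_info(field, k=8):
     # Filter the field by the pointwise mutual information between each
     # cell's k-step past and its next state, estimated from counts pooled
     # over all cells and times (local active information storage).
     steps, width = field.shape
     past = np.zeros((steps - k, width), dtype=np.int64)
     for d in range(1, k + 1):
         past += field[k - d:steps - d].astype(np.int64) << (d - 1)
     nxt = field[k:].astype(np.int64)
     joint = past * 2 + nxt
     n = joint.size
     p_joint = np.bincount(joint.ravel(), minlength=2 ** (k + 1)) / n
     p_past = np.bincount(past.ravel(), minlength=2 ** k) / n
     p_next = np.bincount(nxt.ravel(), minlength=2) / n
     return np.log2(p_joint[joint] / (p_past[past] * p_next[nxt]))
 
 # Low or negative values mark cells whose next state is poorly predicted
 # by their own past, highlighting the propagating structures.
 filtered = local_active_info(spacetime())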
Statistical Mechanics of Interactive Learning
Still, Suzanne (sstill@hawaii.edu)
University of Hawaii at Manoa
Links:
Measuring the Complexity of Psychological States
Tononi, Giulio (gtononi@wisc.edu)
University of Wisconsin-Madison
Links:
Ergodic Parameters and Dynamical Complexity
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)
University of Lisbon
Using a cocycle formulation, old and new ergodic parameters beyond the Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies and fluctuations of the local expansion rate are related by a generalization of the Pesin formula.

How the ergodic parameters may be used to characterize the complexity of dynamical systems is illustrated by some examples: clustering and synchronization, self-organized criticality, and the topological structure of networks.
Links: [[10]]
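For context, the classical statement being generalized is Pesin's formula, which equates the Kolmogorov-Sinai entropy with the sum of the positive Lyapunov exponents:

 <math>h_{KS} = \sum_{\lambda_i > 0} \lambda_i.</math>

The dynamical Rényi entropies <math>K_q</math> reduce to it at <math>q = 1</math> (<math>K_1 = h_{KS}</math>); the talk's generalization relates <math>K_q</math> to the fluctuation statistics of finite-time local expansion rates, in a cocycle formulation whose details are the talk's own.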
Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States
Wiesner, Karoline (k.wiesner@bristol.ac.uk)
University of Bristol
Links: [[11]]