{{Randomness, Structure and Causality}}
[[Media:Agenda.pdf|Agenda PDF]]
== Abstracts ==
 
 
 
 
----
 
'''Effective Complexity of Stationary Process Realizations'''
 
<br>
 
<br>
 
Ay, Nihat (nay@mis.mpg.de)
 
<br>
 
SFI & Max Planck Institute
 
<br>
 
<br>
 
Links: [http://arxiv.org/abs/1001.2686 arXiv:1001.2686]
 
----
 
'''Learning Out of Equilibrium'''
 
<br>
 
Bell, Tony (tony@salk.edu)
 
<br>
 
UC Berkeley
 
<br>
 
<br>
 
Links:
 
----
 
'''The Transmission Sense of Information'''
 
<br>
 
<br>
 
Bergstrom, Carl (cbergst@u.washington.edu)
 
<br>
 
SFI & University of Washington
 
<br>
 
<br>
 
Links: [http://arxiv.org/abs/0810.4168 arXiv:0810.4168]
 
----
 
'''Optimizing Information Flow in Small Genetic Networks'''
 
<br>
 
<br>
 
Bialek, William (wbialek@Princeton.EDU)
 
<br>
 
Princeton University
 
<br>
 
<br>
 
Links: [http://arxiv.org/abs/0912.5500 arXiv:0912.5500]
 
----
 
'''To a Mathematical Theory of Evolution and Biological Creativity'''
 
<br>
 
<br>
 
Chaitin, Gregory (gjchaitin@gmail.com)
 
<br>
 
IBM Watson Research Center
 
<br>
 
<br>
 
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.
 
<br>
 
<br>
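As a concrete, much-simplified illustration of the hill-climbing picture sketched above, here is a minimal Python toy (my construction, not one of the talk's two models): the "organism" is a bit string whose fitness is simply the integer it encodes, point mutations are applied at random, and a mutation is kept only if fitness increases.

<pre>
import random

def fitness(genome):
    # Toy fitness: the integer the bit string encodes. In the actual
    # models the organism is a computer program, not a bit string.
    return int(''.join(map(str, genome)), 2)

def mutate(genome):
    # Point mutation: flip one randomly chosen bit.
    g = genome[:]
    g[random.randrange(len(g))] ^= 1
    return g

def evolve(n_bits=16, steps=10000):
    organism = [0] * n_bits              # least-fit starting genome
    history = [fitness(organism)]
    for _ in range(steps):
        candidate = mutate(organism)
        if fitness(candidate) > fitness(organism):   # hill climbing
            organism = candidate
            history.append(fitness(organism))
    return history

if __name__ == '__main__':
    h = evolve()
    print('fitness increases:', len(h) - 1, '; final fitness:', h[-1])
</pre>

Counting the accepted mutations as a function of time gives a crude stand-in for the "rate of evolutionary progress" that the talk characterizes rigorously.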
 
Links: [[File:Darwin.pdf]]
 
----
 
'''Framing Complexity'''
 
<br>
 
<br>
 
Crutchfield, James (chaos@cse.ucdavis.edu)<br>
 
SFI & UC Davis
 
<br>
 
<br>
 
Is there a theory of complex systems? And who should care, anyway?
 
<br>
 
<br>
 
Links: [http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm Computational Mechanics publications]
 
 
 
----
 
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''
 
<br>
 
 
 
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br>
 
Polish Academy of Sciences<br>
 
<br>
 
We will present a new explanation for the distribution of words in natural language which is grounded in information theory and inspired by recent research in excess entropy. Namely, we will demonstrate a theorem with the following informal statement: if a text of length <math>n</math> describes <math>n^\beta</math> independent facts in a repetitive way, then the text contains at least <math>n^\beta/\log n</math> different words. In the formal statement, two modeling postulates are adopted. Firstly, the words are understood as nonterminal symbols of the shortest grammar-based encoding of the text. Secondly, the text is assumed to be emitted by a finite-energy strongly nonergodic source, whereas the facts are binary IID variables predictable in a shift-invariant way. Besides the theorem, we will exhibit a few stochastic processes to which this and similar statements can be related.
 
<br>
 
<br>
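Written out, the informal statement bounds the vocabulary size <math>V(n)</math> of a text of length <math>n</math> from below (this compact form is my paraphrase; the exact constants and conditions are in the linked papers):

<math>V(n) \,\ge\, c\,\frac{n^{\beta}}{\log n}, \qquad 0 < \beta < 1,</math>

for some constant <math>c > 0</math>. For instance, with <math>\beta = 1/2</math> and <math>n = 10^6</math>, a text describing <math>10^3</math> independent facts repetitively must contain on the order of <math>10^3/\log 10^6 \approx 70</math> distinct words.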
 
 
 
Links: [http://arxiv.org/abs/0810.3125 arXiv:0810.3125] and [http://arxiv.org/abs/0911.5318 arXiv:0911.5318]
 
 
 
----
 
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''
 
<br>
 
<br>
 
Ellison, Christopher (cellison@cse.ucdavis.edu)<br>
 
Complexity Sciences Center, UC Davis
 
<br>
 
<br>

We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these results lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
 
<br>
 
<br>
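For orientation, the quantities the abstract names can be summarized as follows, in the notation of the linked paper as I read it: the excess entropy is the past-future mutual information and, by the result above, also the mutual information between the predictive causal states <math>\mathcal{S}^+</math> and the retrodictive causal states <math>\mathcal{S}^-</math>,

<math>\mathbf{E} \,=\, I[\overleftarrow{X}; \overrightarrow{X}] \,=\, I[\mathcal{S}^+; \mathcal{S}^-],</math>

while crypticity <math>\chi = C_\mu - \mathbf{E}</math> measures how much of the stored state information <math>C_\mu</math> is inaccessible from observations, and causal irreversibility <math>\Xi = C_\mu^+ - C_\mu^-</math> compares the forward and reverse statistical complexities.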
 
Links: [http://arxiv.org/abs/0905.3587 arXiv:0905.3587]
 
 
 
----
 
'''Complexity Measures and Frustration'''
 
<br>
 
<br>
 
Feldman, David (dave@hornacek.coa.edu)<br>
 
College of the Atlantic
 
<br>
 
<br>
 
In this talk I will present some new results applying complexity measures to frustrated systems, and I will also comment on some frustrations I have about past and current work in complexity measures. I will conclude with a number of open questions and ideas for future research.

I will begin with a quick review of the excess entropy/predictive information and argue that it is a well understood and broadly applicable measure of complexity that allows for a comparison of information-processing abilities among very different systems. The vehicle for this comparison is the complexity-entropy diagram, a scatter-plot of the entropy and excess entropy as model parameters are varied. This allows for a direct comparison in terms of the configurations' intrinsic information-processing properties. To illustrate this point, I will show complexity-entropy diagrams for: 1D and 2D Ising models, 1D cellular automata, the logistic map, an ensemble of Markov chains, and an ensemble of epsilon-machines.

I will then present some new work in which a local form of the 2D excess entropy is calculated for a frustrated spin system. This allows one to see how information and memory are shared unevenly across the lattice as the system enters a glassy state. These results show that localised information-theoretic complexity measures can be usefully applied to heterogeneous lattice systems. I will argue that local complexity measures for higher-dimensional and heterogeneous systems are a particularly fruitful area for future research.

Finally, I will conclude by remarking upon some of the areas of complexity-measure research that have been sources of frustration. These include the persistent notion of a universal "complexity at the edge of chaos," and the relative lack of applications of complexity measures to empirical data and/or multidimensional systems. These remarks are designed to provoke dialog and discussion about interesting and fun areas for future research.
 
<br>
 
<br>
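As a sketch of how one panel of a complexity-entropy diagram could be generated, the following Python snippet samples an ensemble of two-state Markov chains and computes, for each, the entropy rate <math>h_\mu</math> and the excess entropy. It relies on the fact that for a first-order Markov chain over the observed alphabet the excess entropy reduces to the one-step mutual information <math>I[X_t; X_{t+1}]</math>; the ensemble and parameter ranges are arbitrary choices for illustration, not those used in the talk.

<pre>
import random, math

def H(ps):
    # Shannon entropy in bits of a probability vector.
    return -sum(p * math.log2(p) for p in ps if p > 0)

def ce_point(p, q):
    # Two-state chain with P(1|0) = p and P(0|1) = q.
    pi0, pi1 = q / (p + q), p / (p + q)            # stationary distribution
    h = pi0 * H([1 - p, p]) + pi1 * H([q, 1 - q])  # entropy rate h_mu
    # Joint distribution of (X_t, X_{t+1}); E = I[X_t; X_{t+1}].
    joint = [pi0 * (1 - p), pi0 * p, pi1 * q, pi1 * (1 - q)]
    E = 2 * H([pi0, pi1]) - H(joint)   # both marginals equal pi by stationarity
    return h, E

if __name__ == '__main__':
    points = [ce_point(random.uniform(0.01, 0.99),
                       random.uniform(0.01, 0.99)) for _ in range(2000)]
    for h, E in points[:5]:            # scatter-plot these pairs
        print(f'h_mu = {h:.3f} bits/symbol   E = {E:.3f} bits')
</pre>

Plotting the full set of <math>(h_\mu, \mathbf{E})</math> pairs yields the Markov-chain panel of the diagram; the other systems listed above require their own estimators.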
 
Links: [[File:afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]]
 
----
 
'''Complexity, Parallel Computation and Statistical Physics'''
 
<br>
 
<br>
 
Machta, Jon (machta@physics.umass.edu)
 
<br>
 
SFI & University of Massachusetts
 
<br>
 
<br>
 
Links: [http://arxiv.org/abs/cond-mat/0510809 arXiv:cond-mat/0510809]
 
----
 
'''Crypticity and Information Accessibility'''
 
<br><br>
 
Mahoney, John (jmahoney3@ucmerced.edu)<br>
 
UC Merced
 
<br>
 
<br>
 
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.
 
<br>
 
<br>
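A relation worth keeping in mind while reading the abstract (stated here from my reading of the linked paper): crypticity is the gap between the stored state information, the statistical complexity <math>C_\mu</math>, and the excess entropy,

<math>\chi \,=\, C_\mu - \mathbf{E},</math>

so an exact finite-order expansion of <math>\chi</math> immediately gives the advertised exact expansion <math>\mathbf{E} = C_\mu - \chi</math> for <math>k</math>-cryptic processes.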
 
Links: [http://arxiv.org/abs/0905.4787 arXiv:0905.4787]
 
 
 
----
 
 
 
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''
 
<br>
 
<br>
 
Mitchell, Melanie (mm@cs.pdx.edu)
 
<br>
 
SFI & Portland State University
 
<br>
 
<br>
 
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems.  An open question is how to best understand such systems in terms of their information-processing capabilities.   In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al.,  the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al.   We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.
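As a minimal, self-contained illustration of the density classification task mentioned above, the sketch below runs the Gacs-Kurdyumov-Levin (GKL) rule, a classic hand-designed density classifier, on random initial configurations and checks whether the lattice settles to the majority value. The rule statement and parameter choices (lattice size 149, 300 update steps) are illustrative assumptions; the talk's filtering methods operate on space-time diagrams like the ones this code produces, but are not implemented here.

<pre>
import random

def gkl_step(cells):
    # GKL rule: a cell in state 0 takes the majority vote of itself and
    # its neighbors at distances 1 and 3 to the LEFT; a cell in state 1
    # uses the neighbors at distances 1 and 3 to the RIGHT. Boundaries
    # are periodic (Python's negative indexing wraps automatically).
    n = len(cells)
    nxt = []
    for i, s in enumerate(cells):
        if s == 0:
            votes = cells[i] + cells[i - 1] + cells[i - 3]
        else:
            votes = cells[i] + cells[(i + 1) % n] + cells[(i + 3) % n]
        nxt.append(1 if votes >= 2 else 0)
    return nxt

def classify(n=149, steps=300):
    cells = [random.randint(0, 1) for _ in range(n)]
    majority = 1 if 2 * sum(cells) > n else 0
    for _ in range(steps):
        cells = gkl_step(cells)
    settled = all(c == cells[0] for c in cells)
    return settled and cells[0] == majority

if __name__ == '__main__':
    trials = 100
    correct = sum(classify() for _ in range(trials))
    print(f'GKL classified {correct}/{trials} random configurations correctly')
</pre>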
 
----
 
'''Phase Transitions and Computational Complexity'''
 
<br>
 
<br>
 
Moore, Cris (moore@cs.unm.edu)
 
<br>
 
SFI & University of New Mexico
 
<br>
 
<br>
 
Links:
 
 
 
----
 
'''Statistical Mechanics of Interactive Learning'''
 
<br>
 
<br>
 
Still, Suzanne (sstill@hawaii.edu)<br>
 
University of Hawaii at Manoa
 
<br><br>
 
Links:
 
----
 
'''Measuring the Complexity of Psychological States'''
 
<br>
 
<br>
 
Tononi, Giulio (gtononi@wisc.edu)<br>

University of Wisconsin-Madison
 
<br><br>
 
Links:
 
----
 
'''Ergodic Parameters and Dynamical Complexity'''
 
<br>
 
<br>
 
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)
 
<br>
 
University of Lisbon
 
<br>
 
<br>
 
Using a cocycle formulation, old and new ergodic parameters beyond the Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies and fluctuations of the local expansion rate are related by a generalization of the Pesin formula.

How the ergodic parameters may be used to characterize the complexity of dynamical systems is illustrated by some examples: clustering and synchronization, self-organized criticality, and the topological structure of networks.
 
<br>
 
<br>
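For orientation, the classical identity being generalized is Pesin's formula, which, under suitable conditions on the invariant measure, equates the Kolmogorov-Sinai entropy with the sum of the positive Lyapunov exponents:

<math>h_{KS} \,=\, \sum_{\lambda_i > 0} \lambda_i.</math>

The talk's generalization relates dynamical Rényi entropies to the fluctuations of the local expansion rate; see the linked preprint for the precise statement.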
 
Links: [http://arxiv.org/abs/1008.2664 arXiv:1008.2664]
 
----
 
'''Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States'''
 
<br>
 
<br>
 
Wiesner, Karoline (k.wiesner@bristol.ac.uk)
 
<br>
 
University of Bristol
 
<br>
 
<br>
 
Links: [http://arxiv.org/abs/1002.2337 arXiv:1002.2337]
 
----
 
