
Randomness, Structure and Causality - Agenda

From Santa Fe Institute Events Wiki

{{Randomness, Structure and Causality}}


 
[[Media:Agenda.pdf|Agenda PDF]]
== Abstracts ==
 
<br>
----
'''Effective Complexity of Stationary Process Realizations'''
<br>
<br>
Ay, Nihat (nay@mis.mpg.de)
<br>
SFI & Max Planck Institute
<br>
<br>
Links: [http://arxiv.org/abs/1001.2686]
----
'''Learning Out of Equilibrium'''
<br>
Bell, Tony (tony@salk.edu)
<br>
UC Berkeley
<br>
<br>
Links:
----
'''The Transmission of Sense Information'''
<br>
<br>
Bergstrom, Carl (cbergst@u.washington.edu)
<br>
SFI & University of Washington
<br>
<br>
Links: [http://arxiv.org/abs/0810.4168]
----
'''Optimizing Information Flow in Small Genetic Networks'''
<br>
<br>
Bialek, William (wbialek@Princeton.EDU)
<br>
Princeton University
<br>
<br>
Links: [http://arxiv.org/abs/0912.5500]
----
'''To a Mathematical Theory of Evolution and Biological Creativity'''
<br>
<br>
Chaitin, Gregory (gjchaitin@gmail.com)
<br>
IBM Watson Research Center
<br>
<br>
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.
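The hill-climbing picture can be sketched in a few lines. In this toy version the organisms are plain bitstrings rather than the computer programs of the models in the talk, and the fitness of an organism is simply the integer its bits name; both choices are illustrative stand-ins, not the talk's construction.

```python
import random

random.seed(1)

def fitness(organism):
    # Toy fitness: the integer named by the bitstring (a stand-in for
    # "the biggest number the organism can compute").
    return int(organism, 2)

def mutate(organism):
    # A point mutation: flip one randomly chosen bit.
    i = random.randrange(len(organism))
    flipped = '1' if organism[i] == '0' else '0'
    return organism[:i] + flipped + organism[i + 1:]

def evolve(organism, steps):
    # Hill climbing on the fitness landscape: a mutation is kept
    # only if it strictly increases fitness.
    history = [fitness(organism)]
    for _ in range(steps):
        candidate = mutate(organism)
        if fitness(candidate) > fitness(organism):
            organism = candidate
        history.append(fitness(organism))
    return organism, history

best, history = evolve('0' * 16, 2000)
print(best, history[-1])
```

Because only improvements are accepted, the fitness record is monotone; the interesting question in the talk's models is how fast it climbs.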
<br>
<br>
Links: [[File:Darwin.pdf]]
----
'''Framing Complexity'''
<br>
<br>
Crutchfield, James (chaos@cse.ucdavis.edu)<br>
SFI & UC Davis
<br>
<br>
Is there a theory of complex systems? And who should care, anyway?
<br>
<br>
Links: [http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]
 
----
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''
<br>
 
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br>
Polish Academy of Sciences<br>
<br>
We will present a new explanation for the distribution of words in
natural language which is grounded in information theory and inspired
by recent research in excess entropy. Namely, we will demonstrate a
theorem with the following informal statement: If a text of length <math>n</math>
describes <math>n^\beta</math> independent facts in a repetitive way then the
text contains at least <math>n^\beta/\log n</math> different words.  In the
formal statement, two modeling postulates are adopted. Firstly, the
words are understood as nonterminal symbols of the shortest
grammar-based encoding of the text. Secondly, the text is assumed to
be emitted by a finite-energy strongly nonergodic source whereas the
facts are binary IID variables predictable in a shift-invariant
way. Besides the theorem, we will exhibit a few stochastic processes
to which this and similar statements can be related.
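As a toy illustration of the informal statement only (not of the grammar-based formal setting), one can build a text of <math>n</math> words that repetitively describes about <math>n^\beta</math> facts and count its distinct words; by construction the vocabulary stays above <math>n^\beta/\log n</math>. The three-word statement format below is an arbitrary choice.

```python
import math

def repetitive_text(n_words, beta=0.5):
    # A text of n_words words that repetitively "describes" about
    # n_words**beta facts: every statement is the three words
    # "fact <i> <bit>".
    k = int(n_words ** beta)                          # number of facts
    bits = ['0' if i % 2 else '1' for i in range(k)]  # stand-in for IID fact bits
    words = []
    i = 0
    while len(words) < n_words:
        words.extend(['fact', str(i % k), bits[i % k]])
        i += 1
    return words[:n_words], k

for n in (10 ** 3, 10 ** 4, 10 ** 5):
    words, k = repetitive_text(n)
    vocab = len(set(words))
    print(n, k, vocab, round(k / math.log(n), 1))
```

Here the vocabulary is essentially the k fact labels, so it exceeds the k/log n lower bound by construction; the substance of the theorem is that this must happen for any sufficiently repetitive description.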
<br>
<br>
 
Links: [http://arxiv.org/abs/0810.3125] and [http://arxiv.org/abs/0911.5318]
 
----
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''
<br>
<br>
Ellison, Christopher (cellison@cse.ucdavis.edu)<br>
Complexity Sciences Center, UC Davis
<br>
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
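For the special case of a first-order Markov chain, the mutual information between the infinite past and the infinite future collapses to I(X_t; X_{t+1}), which makes the excess entropy easy to compute exactly and to check against an empirical estimate. The transition matrix below is an arbitrary example, not one from the talk.

```python
import math
import random

random.seed(0)

# Two-state Markov chain over symbols {0, 1}; row i is P(next = j | current = i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution pi solves pi = pi P (closed form for two states).
pi1 = P[0][1] / (P[0][1] + P[1][0])
pi = [1.0 - pi1, pi1]

# Exact excess entropy of a first-order Markov chain: E = I(X_t; X_{t+1}).
E = sum(pi[i] * P[i][j] * math.log2(P[i][j] / pi[j])
        for i in range(2) for j in range(2))

# Empirical check: estimate I(X_t; X_{t+1}) from a long sample path.
n = 200000
x = 0
counts = {(i, j): 0 for i in range(2) for j in range(2)}
for _ in range(n):
    y = 0 if random.random() < P[x][0] else 1
    counts[(x, y)] += 1
    x = y

px = [(counts[(i, 0)] + counts[(i, 1)]) / n for i in range(2)]
py = [(counts[(0, j)] + counts[(1, j)]) / n for j in range(2)]
E_hat = sum((counts[(i, j)] / n) * math.log2((counts[(i, j)] / n) / (px[i] * py[j]))
            for i in range(2) for j in range(2) if counts[(i, j)] > 0)
print(round(E, 4), round(E_hat, 4))
```

For such directly observed Markov chains the causal states coincide with the symbols, so crypticity vanishes; the hidden-process case treated in the talk is where the time-symmetric machinery earns its keep.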
<br>
<br>
Links: [http://arxiv.org/abs/0905.3587]
 
----
'''Spatial Information Theory'''
<br>
<br>
Feldman, David (dave@hornacek.coa.edu)<br>
College of the Atlantic
<br>
<br>
Links: [[File:afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]]
----
'''Crypticity and Information Accessibility'''
<br><br>
Mahoney, John (jmahoney3@ucmerced.edu)<br>
UC Merced
<br>
<br>
Links: [http://arxiv.org/abs/0905.4787]
----
'''Complexity, Parallel Computation and Statistical Physics'''
<br><br>
Machta, Jon (machta@physics.umass.edu)
<br>
SFI & University of Massachusetts
<br>
<br>
Links: [http://arxiv.org/abs/cond-mat/0510809]
 
----
 
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''
<br>
<br>
Mitchell, Melanie (mm@cs.pdx.edu)
<br>
SFI & Portland State University
<br>
<br>
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems.  An open question is how to best understand such systems in terms of their information-processing capabilities.   In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al.,  the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al.   We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.
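A minimal simulator for elementary cellular automata makes such experiments easy to reproduce. The sketch below is not one of the evolved rules from the talk; it uses Fuks's known two-rule construction for the density classification task (traffic rule 184 followed by majority rule 232 on an odd-sized ring), with generous step counts rather than the minimal ones of the original construction.

```python
import random

def eca_step(cells, rule):
    # One synchronous update of an elementary CA on a ring;
    # `rule` is the Wolfram rule number (0-255).
    n = len(cells)
    return [(rule >> (4 * cells[i - 1] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

def classify_density(cells):
    # Fuks's two-rule scheme: rule 184 (traffic) spreads the excess
    # symbol out, then rule 232 (local majority) drives the lattice to
    # consensus.  Running each rule for n steps is more than enough.
    n = len(cells)
    for _ in range(n):
        cells = eca_step(cells, 184)
    for _ in range(n):
        cells = eca_step(cells, 232)
    return cells

random.seed(3)
n = 21  # odd ring size, so the initial density is never exactly 1/2
results = []
for _ in range(20):
    init = [random.randint(0, 1) for _ in range(n)]
    majority = int(sum(init) > n // 2)
    results.append(classify_density(init) == [majority] * n)
print(sum(results), "of", len(results), "initial conditions classified correctly")
```

Rule 184 is number-conserving, so the density is untouched until the majority stage; the space-time diagrams of such runs are exactly the kind of object the filters discussed in the talk are designed to analyze.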
 
----
'''Statistical Mechanics of Interactive Learning'''
<br>
<br>
Still, Suzanne (sstill@hawaii.edu)<br>
University of Hawaii at Manoa
<br><br>
Links:
----
'''Measuring the Complexity of Psychological States'''
<br>
<br>
Tononi, Giulio (gtononi@wisc.edu)<br>
University of Wisconsin-Madison
<br><br>
Links:
----
'''Ergodic Parameters and Dynamical Complexity'''
<br>
<br>
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)
<br>
University of Lisbon
<br>
<br>
Using a cocycle formulation, old and new ergodic parameters beyond the
Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies
and fluctuations of the local expansion rate are related by a generalization
of the Pesin formula. How the ergodic parameters may be used to characterize
the complexity of dynamical systems is illustrated by some examples:
clustering and synchronization, self-organized criticality, and the
topological structure of networks.
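The simplest such ergodic parameter, the Lyapunov exponent, is the time average of the log of the derivative cocycle along an orbit. A sketch for the logistic map at r = 4, where the exponent is known to equal ln 2:

```python
import math

def lyapunov_logistic(r, x0, n_steps, n_transient=1000):
    # Lyapunov exponent of the logistic map f(x) = r x (1 - x), computed
    # as the time average of log|f'(x)| along an orbit -- the simplest
    # ergodic parameter obtained from the derivative cocycle.
    x = x0
    for _ in range(n_transient):        # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_steps):
        if x <= 0.0 or x >= 1.0 or x == 0.5:
            x = 0.3141592653589793      # numerical safeguard: leave a degenerate orbit
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1 - x)
    return total / n_steps

# At r = 4 the logistic map is conjugate to the tent map, so the
# exponent should converge to log(2) ~ 0.6931.
lam = lyapunov_logistic(4.0, 0.3, 100000)
print(lam)
```

The dynamical Rényi entropies of the abstract generalize exactly this kind of time average, weighting the fluctuations of the local expansion rate instead of averaging them away.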
<br>
<br>
Links: [http://arxiv.org/abs/1008.2664]
----
'''Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States'''
<br>
<br>
Wiesner, Karoline (k.wiesner@bristol.ac.uk)
<br>
University of Bristol
<br>
<br>
Links: [http://arxiv.org/abs/1002.2337]
----

Latest revision as of 01:17, 3 January 2011