

{{Randomness, Structure and Causality}}

 

[[Media:Agenda.pdf|Agenda PDF]]
 == Abstracts ==
 
  
 <br>
 
 
 
 '''Effective Complexity of Stationary Process Realizations'''
 
 <br>
 
 <br>
 
 Ay, Nihat (nay@mis.mpg.de)
 
 <br>
 
 SFI & Max Planck Institute
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/1001.2686 arXiv:1001.2686]
 
 
 
 '''Learning Out of Equilibrium'''
 
 <br>
 
 Bell, Tony (tony@salk.edu)
 
 <br>
 
 UC Berkeley
 
 <br>
 
 <br>
 
 Links:
 
 
 
 '''The Transmission of Sense Information'''
 
 <br>
 
 <br>
 
 Bergstrom, Carl (cbergst@u.washington.edu)
 
 <br>
 
 SFI & University of Washington
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/0810.4168 arXiv:0810.4168]
 
 
 
 '''Optimizing Information Flow in Small Genetic Networks'''
 
 <br>
 
 <br>
 
 Bialek, William (wbialek@Princeton.EDU)
 
 <br>
 
 Princeton University
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/0912.5500 arXiv:0912.5500]
 
 
 
 '''To a Mathematical Theory of Evolution and Biological Creativity'''
 
 <br>
 
 <br>
 
 Chaitin, Gregory (gjchaitin@gmail.com)
 
 <br>
 
 IBM Watson Research Center
 
 <br>
 
 <br>
 
We present an information-theoretic analysis of Darwin's theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.
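The random walk of increasing fitness can be sketched with a deliberately simplified toy model (not one of the two models of the talk, which mutate computer programs): here the "organism" is a bit string, fitness counts 1-bits, and a random point mutation is kept only if it strictly improves fitness.

```python
import random

def fitness(org):
    # Toy fitness function: number of 1-bits (a stand-in for program fitness)
    return sum(org)

def evolve(n_bits=32, steps=2000, seed=0):
    """Hill-climbing walk: accept a random point mutation only if fitness rises."""
    rng = random.Random(seed)
    org = [0] * n_bits
    history = [fitness(org)]
    for _ in range(steps):
        mutant = org[:]
        i = rng.randrange(n_bits)
        mutant[i] ^= 1  # random point mutation: flip one bit
        if fitness(mutant) > fitness(org):  # keep only strict improvements
            org = mutant
        history.append(fitness(org))
    return history

hist = evolve()
```

By construction the fitness trace is non-decreasing; the interesting quantity in the talk's setting is how fast it climbs, which this toy collapses to a coupon-collector process.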
 
 <br>
 
 <br>
 
 Links: [[File:Darwin.pdf]]
 
 
 
 '''Framing Complexity'''
 
 <br>
 
 <br>
 
 Crutchfield, James (chaos@cse.ucdavis.edu)<br>
 
 SFI & UC Davis
 
 <br>
 
 <br>
 
 Is there a theory of complex systems? And who should care, anyway?
 
 <br>
 
 <br>
 
Links: [http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm Computational Mechanics publications]
 
  
 
 
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''
 
 <br>
 
  
 Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br>
 
 Polish Academy of Sciences<br>
 
 <br>
 
 
We will present a new explanation for the distribution of words in natural language which is grounded in information theory and inspired by recent research in excess entropy. Namely, we will demonstrate a theorem with the following informal statement: if a text of length <math>n</math> describes <math>n^\beta</math> independent facts in a repetitive way, then the text contains at least <math>n^\beta/\log n</math> different words. In the formal statement, two modeling postulates are adopted. Firstly, the words are understood as nonterminal symbols of the shortest grammar-based encoding of the text. Secondly, the text is assumed to be emitted by a finite-energy strongly non-ergodic source, whereas the facts are binary IID variables predictable in a shift-invariant way. Besides the theorem, we will exhibit a few stochastic processes to which this and similar statements can be related.
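As a purely numerical illustration of the scale of the theorem's lower bound (the function below is ours, not from the abstract), one can tabulate <math>n^\beta/\log n</math> for a few text lengths:

```python
import math

def vocab_lower_bound(n, beta=0.5):
    # Informal lower bound from the theorem: at least n**beta / log n distinct words
    return n ** beta / math.log(n)

for n in (10**3, 10**6, 10**9):
    print(f"n = {n:>10}: at least ~{vocab_lower_bound(n):.0f} distinct words")
```

The bound grows like a power law in <math>n</math> up to a logarithmic correction, which is the connection to Zipf-like vocabulary growth.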
 
 <br>
 
 <br>
 
  
Links: [http://arxiv.org/abs/0810.3125 arXiv:0810.3125] and [http://arxiv.org/abs/0911.5318 arXiv:0911.5318]
 
  
 
 
 '''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''
 
 <br>
 
 <br>
 
 Ellison, Christopher (cellison@cse.ucdavis.edu)<br>
 
 Complexity Sciences Center, UC Davis
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/0905.3587 arXiv:0905.3587]
 
  
 
 
 '''Spatial Information Theory'''
 
 <br>
 
 <br>
 
 Feldman, David (dave@hornacek.coa.edu)<br>
 
 College of the Atlantic
 
 <br>
 
 <br>
 
Links: [[File:afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]]
 
 
 
 '''Crypticity and Information Accessibility'''
 
 <br><br>
 
 Mahoney, John (jmahoney3@ucmerced.edu)<br>
 
 UC Merced
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/0905.4787 arXiv:0905.4787]
 
 
 
 '''Complexity, Parallel Computation and Statistical Physics'''
 
 <br><br>
 
 Machta, Jon (machta@physics.umass.edu)
 
 <br>
 
 SFI & University of Massachusetts
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/cond-mat/0510809 arXiv:cond-mat/0510809]
 
  
 
 
  
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''
 
 <br>
 
 <br>
 
 Mitchell, Melanie (mm@cs.pdx.edu)
 
 <br>
 
 SFI & Portland State University
 
 <br>
 
 <br>
 
Cellular automata have been widely used as idealized models of natural spatially extended dynamical systems. An open question is how best to understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information-theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.
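To make the density classification task concrete, here is a minimal toy: a radius-1 local-majority rule on a ring, which ideally should drive the lattice to all 1s exactly when 1s are the initial majority. This simple rule is known to fail on many configurations (it freezes into blocks) and is not one of the methods reviewed in the talk; it only illustrates the task.

```python
def local_majority_step(state, r=1):
    """One synchronous update of a radius-r local-majority rule on a ring."""
    n = len(state)
    return [1 if sum(state[(i + d) % n] for d in range(-r, r + 1)) > r else 0
            for i in range(n)]

def classify_density(init, steps=50):
    """Iterate the rule until a fixed point or the step budget is exhausted."""
    state = list(init)
    for _ in range(steps):
        nxt = local_majority_step(state)
        if nxt == state:  # fixed point reached
            break
        state = nxt
    return state

# Majority of 1s with no adjacent 0s: settles to all 1s
out = classify_density([1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1])
```

Rules that actually perform well on this task, such as the GKL rule or rules evolved by genetic algorithms, propagate particle-like structures across the lattice, and the filtering methods above are designed to expose exactly those structures.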
 
  
 
 
 '''Statistical Mechanics of Interactive Learning'''
 
 <br>
 
 <br>
 
 Still, Suzanne (sstill@hawaii.edu)<br>
 
 University of Hawaii at Manoa
 
 <br><br>
 
 Links:
 
 
 
 '''Measuring the Complexity of Psychological States'''
 
 <br>
 
 <br>
 
Tononi, Giulio (gtononi@wisc.edu)<br>

University of Wisconsin–Madison
 
 <br><br>
 
 Links:
 
 
 
 '''Ergodic Parameters and Dynamical Complexity'''
 
 <br>
 
 <br>
 
Vilela Mendes, Rui (vilela@cii.fc.ul.pt)
 
 <br>
 
 University of Lisbon
 
 <br>
 
 <br>
 
Using a cocycle formulation, old and new ergodic parameters beyond the Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies and fluctuations of the local expansion rate are related by a generalization of the Pesin formula. How the ergodic parameters may be used to characterize the complexity of dynamical systems is illustrated by some examples: clustering and synchronization, self-organized criticality, and the topological structure of networks.
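For context (not part of the abstract), the classical Pesin identity that such generalizations extend states that, for an ergodic measure with suitable smoothness properties (e.g. an SRB measure), the Kolmogorov–Sinai entropy equals the sum of the positive Lyapunov exponents:

<math>h_{\mu} = \sum_{\lambda_i > 0} \lambda_i.</math>

The dynamical Rényi entropies refine this picture by weighting fluctuations of the local expansion rate, rather than only its mean.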
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/1008.2664 arXiv:1008.2664]
 
 
 
'''Hidden Quantum Markov Models and Nonadaptive Readout of Many-body States'''
 
 <br>
 
 <br>
 
 Wiesner, Karoline (k.wiesner@bristol.ac.uk)
 
 <br>
 
 University of Bristol
 
 <br>
 
 <br>
 
Links: [http://arxiv.org/abs/1002.2337 arXiv:1002.2337]
 
 
 