

{{Randomness, Structure and Causality}}

[[Media:Agenda.pdf|Agenda PDF]]

== Abstracts ==
 
 
'''Effective Complexity of Stationary Process Realizations'''
<br>
<br>
Ay, Nihat (nay@mis.mpg.de)
<br>
SFI & Max Planck Institute
<br>
<br>
Links: [http://arxiv.org/abs/1001.2686]
 
'''Learning Out of Equilibrium'''
<br>
<br>
Bell, Tony (tony@salk.edu)
<br>
UC Berkeley
<br>
<br>
Links:
 
'''The Transmission of Sense Information'''
<br>
<br>
Bergstrom, Carl (cbergst@u.washington.edu)
<br>
SFI & University of Washington
<br>
<br>
Links: [http://arxiv.org/abs/0810.4168]
 
'''Optimizing Information Flow in Small Genetic Networks'''
<br>
<br>
Bialek, William (wbialek@Princeton.EDU)
<br>
Princeton University
<br>
<br>
Links: [http://arxiv.org/abs/0912.5500]
 
'''To a Mathematical Theory of Evolution and Biological Creativity'''
<br>
<br>
Chaitin, Gregory (gjchaitin@gmail.com)
<br>
IBM Watson Research Center
<br>
<br>
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.
<br>
<br>
Links: [[File:Darwin.pdf]]
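The hill-climbing random walk described above can be sketched in a few lines. Note this is only a toy illustration: Chaitin's actual models mutate computer programs and use a fitness notion tied to uncomputable quantities, whereas the stand-in below uses bit strings and a trivial fitness (all names here are hypothetical, not from the talk).

```python
import random

def fitness(organism):
    """Toy fitness: number of 1-bits (a stand-in for program fitness)."""
    return sum(organism)

def evolve(n_bits=64, steps=10_000, seed=0):
    """Single mutating organism: accept a random point mutation
    only if it strictly increases fitness (hill climbing)."""
    rng = random.Random(seed)
    organism = [0] * n_bits
    history = [fitness(organism)]
    for _ in range(steps):
        mutant = organism.copy()
        mutant[rng.randrange(n_bits)] ^= 1   # random point mutation
        if fitness(mutant) > fitness(organism):
            organism = mutant                # climb the fitness landscape
        history.append(fitness(organism))
    return history

if __name__ == "__main__":
    h = evolve()
    print(h[0], h[-1])   # fitness never decreases along the walk
```

The quantity of interest in the talk is the rate at which such a walk gains fitness, which in this toy version slows as fewer mutations remain beneficial.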
 
'''Framing Complexity'''
<br>
<br>
Crutchfield, James (chaos@cse.ucdavis.edu)
<br>
SFI & UC Davis
<br>
<br>
Is there a theory of complex systems? And who should care, anyway?
<br>
<br>
Links: [http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]
 
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''
<br>
<br>
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)
<br>
Polish Academy of Sciences
<br>
<br>
We will present a new explanation for the distribution of words in natural language which is grounded in information theory and inspired by recent research on excess entropy. Namely, we will demonstrate a theorem with the following informal statement: If a text of length <math>n</math> describes <math>n^\beta</math> independent facts in a repetitive way, then the text contains at least <math>n^\beta/\log n</math> different words. In the formal statement, two modeling postulates are adopted. Firstly, the words are understood as nonterminal symbols of the shortest grammar-based encoding of the text. Secondly, the text is assumed to be emitted by a finite-energy strongly non-ergodic source, whereas the facts are binary IID variables predictable in a shift-invariant way. Besides the theorem, we will exhibit a few stochastic processes to which this and similar statements can be related.
<br>
<br>
Links: [http://arxiv.org/abs/0810.3125] and [http://arxiv.org/abs/0911.5318]
 
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''
<br>
<br>
Ellison, Christopher (cellison@cse.ucdavis.edu)
<br>
Complexity Sciences Center, UC Davis
<br>
<br>
We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy, a familiar measure of organization in complex systems, is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these results lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
<br>
<br>
Links: [http://arxiv.org/abs/0905.3587]
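The excess-entropy-as-mutual-information identity mentioned in the abstract can be checked numerically on a toy example. For a binary first-order Markov chain, the mutual information between a length-L past and a length-L future is I = 2H(L) − H(2L), computable exactly by enumerating blocks; a minimal sketch (my own illustration, not the authors' code) shows it is the same for every L:

```python
from itertools import product
from math import log2

# Example process: symmetric binary Markov chain with flip probability p.
p = 0.1
T = [[1 - p, p], [p, 1 - p]]   # T[i][j] = P(next=j | current=i)
pi = [0.5, 0.5]                # stationary distribution (symmetric chain)

def block_entropy(L):
    """Exact Shannon entropy (bits) of length-L blocks of the chain."""
    H = 0.0
    for w in product((0, 1), repeat=L):
        prob = pi[w[0]]
        for a, b in zip(w, w[1:]):
            prob *= T[a][b]
        if prob > 0:
            H -= prob * log2(prob)
    return H

def past_future_mi(L):
    """I(past_L; future_L) = 2 H(L) - H(2L)."""
    return 2 * block_entropy(L) - block_entropy(2 * L)

if __name__ == "__main__":
    for L in (1, 2, 4):
        print(L, past_future_mi(L))   # identical for all L: the excess entropy
```

For a Markov chain H(L) = H(1) + (L−1)h, so 2H(L) − H(2L) = H(1) − h independently of L, which is exactly the excess entropy of the chain.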
 
'''Complexity Measures and Frustration'''
<br>
<br>
Feldman, David (dave@hornacek.coa.edu)
<br>
College of the Atlantic
<br>
<br>
In this talk I will present some new results applying complexity measures to frustrated systems, and I will also comment on some frustrations I have about past and current work in complexity measures. I will conclude with a number of open questions and ideas for future research.
<br>
<br>
I will begin with a quick review of the excess entropy/predictive information and argue that it is a well-understood and broadly applicable measure of complexity that allows for a comparison of information-processing abilities among very different systems. The vehicle for this comparison is the complexity-entropy diagram, a scatterplot of the entropy and excess entropy as model parameters are varied. This allows for a direct comparison in terms of the configurations' intrinsic information-processing properties. To illustrate this point, I will show complexity-entropy diagrams for: 1D and 2D Ising models, 1D cellular automata, the logistic map, an ensemble of Markov chains, and an ensemble of epsilon-machines.
<br>
<br>
I will then present some new work in which a local form of the 2D excess entropy is calculated for a frustrated spin system. This allows one to see how information and memory are shared unevenly across the lattice as the system enters a glassy state. These results show that localised information-theoretic complexity measures can be usefully applied to heterogeneous lattice systems. I will argue that local complexity measures for higher-dimensional and heterogeneous systems are a particularly fruitful area for future research.
<br>
<br>
Finally, I will conclude by remarking upon some of the areas of complexity-measure research that have been sources of frustration. These include the persistent notion of a universal "complexity at the edge of chaos," and the relative lack of applications of complexity measures to empirical data and/or multidimensional systems. These remarks are designed to provoke dialog and discussion about interesting and fun areas for future research.
<br>
<br>
Links: [[File:afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]]
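For the Markov-chain ensemble mentioned above, both coordinates of a complexity-entropy diagram have closed forms: the entropy rate is the stationary average of the per-state transition entropies, and for a first-order Markov chain the excess entropy reduces to E = H[X] − h. A minimal sketch of one such sweep (my own illustration, not Feldman's code):

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * log2(p) for p in dist if p > 0)

def markov_complexity_entropy(T):
    """One point on a complexity-entropy diagram for a binary
    first-order Markov chain with transition matrix T[i][j] = P(j | i).
    For Markov chains H(L) = H[X] + (L-1) h exactly, so E = H[X] - h."""
    a, b = T[0][1], T[1][0]            # P(0 -> 1), P(1 -> 0)
    pi = (b / (a + b), a / (a + b))    # stationary distribution
    h = sum(pi[i] * entropy(T[i]) for i in range(2))   # entropy rate
    E = entropy(pi) - h                # excess entropy
    return h, E

if __name__ == "__main__":
    # sweep a small ensemble of symmetric chains: low flip probability
    # means low entropy rate and high excess entropy (memory)
    for p in (0.05, 0.25, 0.5):
        T = [[1 - p, p], [p, 1 - p]]
        h, E = markov_complexity_entropy(T)
        print(f"p={p}: h={h:.3f}, E={E:.3f}")
```

Scatterplotting (h, E) over such an ensemble gives exactly the kind of diagram the abstract describes, with the fair-coin chain (p = 0.5) sitting at maximal entropy and zero complexity.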
 
'''Complexity, Parallel Computation and Statistical Physics'''
<br>
<br>
Machta, Jon (machta@physics.umass.edu)
<br>
SFI & University of Massachusetts
<br>
<br>
Links: [http://arxiv.org/abs/cond-mat/0510809]
 
'''Crypticity and Information Accessibility'''
<br>
<br>
Mahoney, John (jmahoney3@ucmerced.edu)
<br>
UC Merced
<br>
<br>
We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity, i.e., the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy (the mutual information between a process's infinite past and infinite future) that is finite and exact for finite-order cryptic processes.
<br>
<br>
Links: [http://arxiv.org/abs/0905.4787]
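Crypticity is the gap between the state information a process stores and the part of it visible in observations: χ = C_μ − E. A back-of-envelope sketch for the standard Golden Mean Process (binary sequences with no consecutive 0s, fair coin where both symbols are allowed); this is my own worked example, not from the talk, so the numbers should be checked against the paper:

```python
from math import log2

def H(*p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * log2(x) for x in p if x > 0)

# Golden Mean epsilon-machine: state A emits 1 (-> A) or 0 (-> B),
# each with probability 1/2; state B must emit 1 (-> A).
pi_A, pi_B = 2 / 3, 1 / 3                    # stationary state distribution

C_mu = H(pi_A, pi_B)                         # statistical complexity
h_mu = pi_A * H(0.5, 0.5) + pi_B * H(1.0)    # entropy rate = 2/3 bit

# The observed process is first-order Markov, so its excess entropy
# is E = H[X] - h_mu, with P(X=0) = pi_A * 1/2 = 1/3.
E = H(1 / 3, 2 / 3) - h_mu

chi = C_mu - E    # crypticity: stored state info hidden from observations
print(f"C_mu={C_mu:.4f}  E={E:.4f}  chi={chi:.4f}")
```

Even this very simple process is cryptic: most of its stored state information (χ = 2/3 bit of C_μ ≈ 0.918 bits) never shows up as past-future mutual information.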
 
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''
<br>
<br>
Mitchell, Melanie (mm@cs.pdx.edu)
<br>
SFI & Portland State University
<br>
<br>
Cellular automata have been widely used as idealized models of natural spatially extended dynamical systems. An open question is how best to understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information-theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.
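To fix ideas, the density classification task asks a CA to relax to all-1s when the initial configuration has a majority of 1s and to all-0s otherwise. A naive baseline is the local-majority rule on a ring; this sketch is only an illustration of the task setup, not one of the evolved rules discussed in the talk (which perform far better than local majority, a rule known to get stuck in frozen blocks):

```python
import random

def step(config, radius=3):
    """One synchronous update of a 1D binary CA on a ring:
    each cell takes the majority value in its radius-3 neighborhood."""
    n = len(config)
    out = []
    for i in range(n):
        votes = sum(config[(i + d) % n] for d in range(-radius, radius + 1))
        out.append(1 if votes > radius else 0)   # majority of 2*radius+1 cells
    return out

def classify(config, steps=100):
    """Run the CA; return 1 on all-1s fixation, 0 on all-0s, None otherwise."""
    for _ in range(steps):
        config = step(config)
        if all(config):
            return 1
        if not any(config):
            return 0
    return None

if __name__ == "__main__":
    rng = random.Random(1)
    config = [1 if rng.random() < 0.7 else 0 for _ in range(99)]
    print(classify(config))
```

The filters surveyed in the talk are designed precisely to reveal where, in space-time diagrams of such rules, the information relevant to this global decision is stored and transmitted.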
 
'''Phase Transitions and Computational Complexity'''
<br>
<br>
Moore, Cris (moore@cs.unm.edu)
<br>
SFI & University of New Mexico
<br>
<br>
Links:
 
'''Statistical Mechanics of Interactive Learning'''
<br>
<br>
Still, Suzanne (sstill@hawaii.edu)
<br>
University of Hawaii at Manoa
<br>
<br>
Links:
 
'''Measuring the Complexity of Psychological States'''
<br>
<br>
Tononi, Giulio (gtononi@wisc.edu)
<br>
University of Wisconsin–Madison
<br>
<br>
Links:
 
'''Ergodic Parameters and Dynamical Complexity'''
<br>
<br>
Vilela Mendes, Rui (vilela@cii.fc.ul.pt)
<br>
University of Lisbon
<br>
<br>
Using a cocycle formulation, old and new ergodic parameters beyond the Lyapunov exponent are rigorously characterized. Dynamical Renyi entropies and fluctuations of the local expansion rate are related by a generalization of the Pesin formula. How the ergodic parameters may be used to characterize the complexity of dynamical systems is illustrated by some examples: clustering and synchronization, self-organized criticality, and the topological structure of networks.
<br>
<br>
Links: [http://arxiv.org/abs/1008.2664]
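The baseline ergodic parameter the abstract goes beyond, the Lyapunov exponent, is the time average of the log of the local expansion rate. A minimal sketch for the logistic map (my own illustration, not from the talk):

```python
from math import log

def lyapunov_logistic(r, x0=0.1, n=100_000, burn_in=1000):
    """Estimate the Lyapunov exponent of the logistic map
    f(x) = r x (1 - x) as the time average of ln|f'(x_t)|,
    where f'(x) = r (1 - 2x)."""
    x = x0
    for _ in range(burn_in):                  # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += log(abs(r * (1 - 2 * x)))    # local expansion rate
        x = r * x * (1 - x)
    return total / n

if __name__ == "__main__":
    # At r = 4 the exact value is ln 2 ~ 0.693 (fully chaotic);
    # at r = 3.2 the orbit is a stable 2-cycle and the exponent is negative.
    print(lyapunov_logistic(4.0), lyapunov_logistic(3.2))
```

The parameters discussed in the talk generalize exactly this construction: Renyi-weighted averages and fluctuations of the same local expansion rate, rather than its plain mean.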
 
'''Hidden Quantum Markov Models and Non-adaptive Readout of Many-body States'''
<br>
<br>
Wiesner, Karoline (k.wiesner@bristol.ac.uk)
<br>
University of Bristol
<br>
<br>
Links: [http://arxiv.org/abs/1002.2337]