

{{Randomness, Structure and Causality}}

 

 
 == Abstracts ==
 

 
 <br>
 
 
 
 '''Effective Complexity of Stationary Process Realizations'''
 
 <br>
 
 <br>
 
 Ay, Nihat (nay@mis.mpg.de)
 
 <br>
 
 SFI & Max Planck Institute
 
 <br>
 
 <br>
 
The concept of the effective complexity of an object as the minimal description length of its regularities was introduced by Gell-Mann and Lloyd. Building on their work, we gave a precise definition of the effective complexity of finite binary strings in terms of algorithmic information theory in our previous paper. Here we study the effective complexity of strings generated by stationary processes. Sufficiently long typical process realizations turn out to be effectively simple under any linear scaling with the string's length of the parameter $\Delta$, which determines the minimization domain. For a class of computable ergodic processes, including i.i.d. and ergodic Markovian processes, a stronger result can be shown: there exist sublinear scalings of $\Delta$ for which typical realizations turn out to be effectively simple. Our results become most transparent in the context of coarse effective complexity, a modification of plain effective complexity in which $\Delta$ appears as a minimization argument. A similar modification of the closely related concept of sophistication was introduced by Antunes and Fortnow as coarse sophistication.
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/1001.2686]]
 
 
 
 '''Learning Out of Equilibrium'''
 
 <br>
 
 Bell, Tony (tony@salk.edu)
 
 <br>
 
 UC Berkeley
 
 <br>
 
 <br>
 
Inspired by new results in nonequilibrium statistical mechanics, we define a new kind of state machine that can be used to model time series. The machine is deterministically coupled to its inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local-time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term, and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward- and backward-time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time-series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data and taking a step towards understanding molecular-level energy-based self-organisation.
 
 <br>
 
 <br>
 
 Links:
 
 
 
 '''The Transmission of Sense Information'''
 
 <br>
 
 <br>
 
 Bergstrom, Carl (cbergst@u.washington.edu)
 
 <br>
 
 SFI & University of Washington
 
 <br>
 
 <br>
 
Biologists rely heavily on the language of information, coding, and transmission that is commonplace in the field of information theory developed by Claude Shannon, but there is open debate about whether such language is anything more than facile metaphor. Philosophers of biology have argued that when biologists talk about information in genes and in evolution, they are not talking about the sort of information that Shannon’s theory addresses. First, philosophers have suggested that Shannon’s theory is only useful for developing a shallow notion of correlation, the so-called “causal sense” of information. Second, they typically argue that in genetics and evolutionary biology, information language is used in a “semantic sense,” whereas semantics are deliberately omitted from Shannon’s theory. Neither critique is well-founded. Here we propose an alternative to the causal and semantic senses of information: a transmission sense of information, in which an object X conveys information if the function of X is to reduce, by virtue of its sequence properties, uncertainty on the part of an agent who observes X. The transmission sense not only captures much of what biologists intend when they talk about information in genes, but also brings Shannon’s theory back to the fore. By taking the viewpoint of a communications engineer and focusing on the decision problem of how information is to be packaged for transport, this approach resolves several problems that have plagued the information concept in biology, and highlights a number of important features of the way that information is encoded, stored, and transmitted as genetic sequence.
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/0810.4168]]
 
 
 
 '''Optimizing Information Flow in Small Genetic Networks'''
 
 <br>
 
 <br>
 
 Bialek, William (wbialek@Princeton.EDU)
 
 <br>
 
 Princeton University
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/0912.5500]]
 
 
 
 '''To a Mathematical Theory of Evolution and Biological Creativity'''
 
 <br>
 
 <br>
 
 Chaitin, Gregory (gjchaitin@gmail.com)
 
 <br>
 
 IBM Watson Research Center
 
 <br>
 
 <br>
 
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.
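The random-walk-of-increasing-fitness picture can be illustrated with a toy hill climber. This is a deliberately simplified stand-in, not Chaitin's program-based model: the bit-string genome and the all-ones fitness function are assumptions of this sketch.

```python
import random

def hill_climb(genome_len=64, max_steps=10_000, seed=0):
    """Random walk of increasing fitness: apply a random point mutation
    and keep the mutant only if fitness (here, the count of 1-bits)
    strictly increases."""
    rng = random.Random(seed)
    genome = [0] * genome_len
    fitness = sum(genome)
    accepted = 0
    for _ in range(max_steps):
        i = rng.randrange(genome_len)
        mutant = genome.copy()
        mutant[i] ^= 1                 # random point mutation
        if sum(mutant) > fitness:      # hill climbing: accept only improvements
            genome, fitness = mutant, sum(mutant)
            accepted += 1
        if fitness == genome_len:      # fitness peak reached
            break
    return fitness, accepted

print(hill_climb())
```

Counting accepted mutations gives a crude rate of progress; the interesting question in the abstract is how that rate scales in the program-based models.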
 
 <br>
 
 <br>
 
 Links: [[File:Darwin.pdf]]
 
 
 
 '''Framing Complexity'''
 
 <br>
 
 <br>
 
 Crutchfield, James (chaos@cse.ucdavis.edu)<br>
 
 SFI & UC Davis
 
 <br>
 
 <br>
 
 Is there a theory of complex systems? And who should care, anyway?
 
 <br>
 
 <br>
 
 Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]
 

 
 
 
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''
 
 <br>
 

 
 Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br>
 
 Polish Academy of Sciences<br>
 
 <br>
 
 
We will present a new explanation for the distribution of words in natural language which is grounded in information theory and inspired by recent research on excess entropy. Namely, we will demonstrate a theorem with the following informal statement: if a text of length <math>n</math> describes <math>n^\beta</math> independent facts in a repetitive way, then the text contains at least <math>n^\beta/\log n</math> different words. In the formal statement, two modeling postulates are adopted. Firstly, the words are understood as nonterminal symbols of the shortest grammar-based encoding of the text. Secondly, the text is assumed to be emitted by a finite-energy strongly non-ergodic source, whereas the facts are binary IID variables predictable in a shift-invariant way. Besides the theorem, we will exhibit a few stochastic processes to which this and similar statements can be related.
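For a sense of scale, the vocabulary bound can be evaluated directly. The values of n and β below are arbitrary choices for illustration, not from the paper.

```python
import math

def min_vocabulary(n, beta):
    """Lower bound from the informal theorem: a text of length n that
    repetitively describes n**beta independent facts contains at least
    n**beta / log(n) distinct words."""
    return n**beta / math.log(n)

n, beta = 10**6, 0.5
print(f"facts described: {n**beta:.0f}")
print(f"distinct words >= {min_vocabulary(n, beta):.1f}")
```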
 
 <br>
 
 <br>
 

 
 Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]
 

 
 
 
 '''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''
 
 <br>
 
 <br>
 
 Ellison, Christopher (cellison@cse.ucdavis.edu)<br>
 
 Complexity Sciences Center, UC Davis
 
 <br>
 
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy, a familiar measure of organization in complex systems, is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
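In the simplest setting, a first-order Markov chain, the past-future mutual information reduces to I(X_0; X_1) between adjacent stationary symbols, because a single symbol summarizes the past in both time directions. That makes the excess entropy computable in closed form; the transition matrix below is an arbitrary example, not taken from the paper.

```python
import math

# Binary Markov chain: P[i][j] = Pr(next = j | current = i)
P = [[0.9, 0.1],
     [0.1, 0.9]]
pi = [0.5, 0.5]   # stationary distribution of this symmetric chain

# Excess entropy of a first-order Markov chain: E = I(X0; X1), the
# mutual information between two adjacent stationary symbols.
E = sum(
    pi[i] * P[i][j] * math.log2(P[i][j] / pi[j])
    for i in range(2) for j in range(2)
)
print(f"excess entropy: {E:.4f} bits")
```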
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/0905.3587]]
 

 
 
 
 '''Complexity Measures and Frustration'''
 
 <br>
 
 <br>
 
 Feldman, David (dave@hornacek.coa.edu)<br>
 
 College of the Atlantic
 
 <br>
 
 <br>
 
In this talk I will present some new results applying complexity measures to frustrated systems, and I will also comment on some frustrations I have about past and current work in complexity measures. I will conclude with a number of open questions and ideas for future research.
 

 
I will begin with a quick review of the excess entropy/predictive information and argue that it is a well-understood and broadly applicable measure of complexity that allows for a comparison of information-processing abilities among very different systems. The vehicle for this comparison is the complexity-entropy diagram, a scatterplot of the entropy and excess entropy as model parameters are varied. This allows for a direct comparison in terms of the configurations' intrinsic information-processing properties. To illustrate this point, I will show complexity-entropy diagrams for: 1D and 2D Ising models, 1D cellular automata, the logistic map, an ensemble of Markov chains, and an ensemble of epsilon-machines.
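For the simplest member of that list, a binary symmetric Markov chain with flip probability p, both coordinates of the diagram have closed forms: entropy rate h = H(p) and excess entropy E = 1 - H(p), so this one-parameter family traces the line h + E = 1. A minimal sketch (the family and sample points are illustrative choices):

```python
import math

def binary_entropy(p):
    """H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Complexity-entropy diagram for a binary symmetric Markov chain with
# flip probability p: entropy rate h = H(p), excess entropy E = 1 - H(p).
points = [(binary_entropy(p), 1 - binary_entropy(p))
          for p in (0.01, 0.1, 0.25, 0.5)]
for h, E in points:
    print(f"h = {h:.3f}  E = {E:.3f}")
```

Richer families (2D Ising, cellular automata) fill out two-dimensional regions of the diagram rather than a single line, which is what makes the comparison informative.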
 

 
I will then present some new work in which a local form of the 2D excess entropy is calculated for a frustrated spin system. This allows one to see how information and memory are shared unevenly across the lattice as the system enters a glassy state. These results show that localised information-theoretic complexity measures can be usefully applied to heterogeneous lattice systems. I will argue that local complexity measures for higher-dimensional and heterogeneous systems are a particularly fruitful area for future research.
 

 
Finally, I will conclude by remarking upon some of the areas of complexity-measure research that have been sources of frustration. These include the persistent notion of a universal "complexity at the edge of chaos," and the relative lack of applications of complexity measures to empirical data and/or multidimensional systems. These remarks are designed to provoke dialog and discussion about interesting and fun areas for future research.
 
 <br>
 
 <br>
 
 Links: [[File:afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]]
 
 
 
 '''Complexity, Parallel Computation and Statistical Physics'''
 
 <br>
 
 <br>
 
 Machta, Jon (machta@physics.umass.edu)
 
 <br>
 
 SFI & University of Massachusetts
 
 <br>
 
 <br>
 
Links: [[http://arxiv.org/abs/cond-mat/0510809]]
 
 
 
 '''Crypticity and Information Accessibility'''
 
 <br><br>
 
 Mahoney, John (jmahoney3@ucmerced.edu)<br>
 
 UC Merced
 
 <br>
 
 <br>
 
We give a systematic expansion of the crypticity, a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity: the internal state information is present across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy, the mutual information between a process's infinite past and infinite future, that is finite and exact for finite-order cryptic processes.
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/0905.4787]]
 

 
 
 

 
 '''Automatic Identification of InformationProcessing Structures in Cellular Automata'''
 
 <br>
 
 <br>
 
 Mitchell, Melanie (mm@cs.pdx.edu)
 
 <br>
 
 SFI & Portland State University
 
 <br>
 
 <br>
 
Cellular automata have been widely used as idealized models of natural spatially extended dynamical systems. An open question is how best to understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information-theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.
 
 
 
 '''Phase Transitions and Computational Complexity'''
 
 <br>
 
 <br>
 
 Moore, Cris (moore@cs.unm.edu)
 
 <br>
 
 SFI & University of New Mexico
 
 <br>
 
 <br>
 
We study EC3, a variant of Exact Cover which is equivalent to Positive 1-in-3 SAT. Random instances of EC3 were recently used as benchmarks for simulations of an adiabatic quantum algorithm. Empirical results suggest that EC3 has a phase transition from satisfiability to unsatisfiability when the number of clauses per variable r exceeds some threshold r* ≈ 0.62 ± 0.01. Using the method of differential equations, we show that if r ≤ 0.546, then with high probability a random instance of EC3 is satisfiable. Combined with previous results, this limits the location of the threshold, if it exists, to the range 0.546 < r* < 0.644.
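At small sizes the Positive 1-in-3 SAT condition can be checked by brute force, which makes the satisfiable/unsatisfiable distinction concrete. The function names and the random-instance generator below are illustrative, not from the paper.

```python
import random
from itertools import product

def ec3_satisfiable(n, clauses):
    """Positive 1-in-3 SAT: is there a 0/1 assignment to n variables
    with exactly one true variable in every clause?  Brute force,
    feasible only for small n."""
    for assignment in product((0, 1), repeat=n):
        if all(sum(assignment[v] for v in clause) == 1 for clause in clauses):
            return True
    return False

def random_ec3(n, r, seed=0):
    """Random instance: round(r * n) clauses, each 3 distinct variables."""
    rng = random.Random(seed)
    return [tuple(rng.sample(range(n), 3)) for _ in range(round(r * n))]

# A single clause is always satisfiable.
print(ec3_satisfiable(3, [(0, 1, 2)]))                           # True
# All four triples over {0,1,2,3}: each variable lies in 3 clauses, so an
# assignment with w true variables gives total clause-count 3w, but the
# 1-in-3 condition forces the total to be 4 -- impossible, hence False.
print(ec3_satisfiable(4, [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]))
# A random instance below the proved bound r = 0.546:
print(ec3_satisfiable(12, random_ec3(12, 0.5)))
```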
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/cs/0508037]]
 

 
 
 
 '''Statistical Mechanics of Interactive Learning'''
 
 <br>
 
 <br>
 
 Still, Suzanne (sstill@hawaii.edu)<br>
 
 University of Hawaii at Manoa
 
 <br>
 
 <br>
 
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/0709.1948]]
 
 
 
 '''Measuring the Complexity of Psychological States'''
 
 <br>
 
 <br>
 
Tononi, Giulio (gtononi@wisc.edu)<br>
 
University of Wisconsin–Madison
 
 <br><br>
 
 Links:
 
 
 
 '''Ergodic Parameters and Dynamical Complexity'''
 
 <br>
 
 <br>
 
Vilela Mendes, Rui (vilela@cii.fc.ul.pt)
 
 <br>
 
 University of Lisbon
 
 <br>
 
 <br>
 
Using a cocycle formulation, old and new ergodic parameters beyond the Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies and fluctuations of the local expansion rate are related by a generalization of the Pesin formula.

How the ergodic parameters may be used to characterize the complexity of dynamical systems is illustrated by some examples: clustering and synchronization, self-organized criticality, and the topological structure of networks.
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/1008.2664]]
 
 
 
'''Hidden Quantum Markov Models and Nonadaptive Readout of Many-body States'''
 
 <br>
 
 <br>
 
 Wiesner, Karoline (k.wiesner@bristol.ac.uk)
 
 <br>
 
 University of Bristol
 
 <br>
 
 <br>
 
Stochastic finite-state generators are compressed descriptions of infinite time series. Alternatively, compressed descriptions are given by quantum finite-state generators [K. Wiesner and J. P. Crutchfield, Physica D 237, 1173 (2008)]. These are based on repeated von Neumann measurements on a quantum dynamical system. Here we generalise the quantum finite-state generators by replacing the von Neumann projections by stochastic quantum operations. In this way we ensure that any time series with a stochastic compressed description has a compressed quantum description. Moreover, we establish a link between our stochastic generators and the sequential readout of many-body states with translationally invariant matrix product state representations. As an example, we consider the nonadaptive readout of 1D cluster states. This is shown to be equivalent to a hidden quantum Markov model with two internal states, providing insight into the inherent complexity of the process. Finally, it is proven by example that the quantum description can have a higher degree of compression than the classical stochastic one.
 
 <br>
 
 <br>
 
 Links: [[http://arxiv.org/abs/1002.2337]]
 
 
 