Santa Fe Institute Events Wiki: Randomness, Structure and Causality - Abstracts (revision of 2011-01-07 by Jmachta)
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''Effective Complexity of Stationary Process Realizations'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
The concept of the effective complexity of an object, as the minimal description length of its regularities, was introduced by Gell-Mann and Lloyd. Building on their work, our previous paper gave a precise definition of the effective complexity of finite binary strings in terms of algorithmic information theory. Here we study the effective complexity of strings generated by stationary processes. Sufficiently long typical process realizations turn out to be effectively simple under any scaling of the parameter $\Delta$, which determines the minimization domain, that is linear in the string's length. For a class of computable ergodic processes, including i.i.d. and ergodic Markovian processes, a stronger result can be shown: there exist sublinear scalings of $\Delta$ for which typical realizations turn out to be effectively simple. Our results become most transparent in the context of coarse effective complexity, a modification of plain effective complexity in which $\Delta$ appears as a minimization argument. A similar modification of the closely related concept of sophistication was introduced by Antunes and Fortnow as coarse sophistication.<br />
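In symbols, one formalization along the lines the abstract describes (paraphrased here; the exact typicality condition in the linked paper may differ) takes the effective complexity of a string <math>x</math> to be the shortest description of a computable ensemble that makes <math>x</math> typical while keeping the total information close to <math>K(x)</math>:

```latex
\mathcal{E}_{\delta,\Delta}(x) \;=\; \min\Bigl\{\, K(E) \;:\; E \ \text{a computable ensemble},\;
  -\log E(x) \,\le\, H(E) + \delta,\;
  K(E) + H(E) \,\le\, K(x) + \Delta \Bigr\},
```

where <math>K</math> is prefix Kolmogorov complexity and <math>H</math> Shannon entropy; "effectively simple" then means <math>\mathcal{E}</math> stays small as <math>\Delta</math> is scaled with the string's length.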
<br><br />
<br><br />
Links: [http://arxiv.org/abs/1001.2686]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state-machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward- and backward-time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Information Aggregation in Correlated Complex Systems and Optimal Estimation'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
Information is a peculiar quantity. Unlike matter and energy, which are conserved by the laws of physics, the aggregation of knowledge from many sources can in fact produce more information (synergy) or less (redundancy) than the sum of its parts, provided these sources are correlated. I discuss how the formal properties of information aggregation, expressed in information-theoretic terms, provide a general window for explaining features of organization in several complex systems. I show under what circumstances collective coordination may pay off in stochastic search problems, how this can be used to estimate functional relations between neurons in living neural tissue, and, more generally, how it may have implications for other network structures in social and biological systems.<br />
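A toy calculation (my own illustration, not from the talk, using the standard XOR and copy examples) makes the synergy/redundancy point concrete: two correlated sources can jointly carry more, or less, information about a target than the sum of what each carries alone.

```python
# Synergy vs. redundancy via mutual information, for equiprobable outcomes.
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """I(A;B) in bits from a list of equiprobable (a, b) outcomes."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

inputs = list(product([0, 1], repeat=2))      # two fair source bits y1, y2

# Synergy: x = y1 XOR y2.  Either source alone gives 0 bits; together, 1 bit.
i_joint = mutual_information([((y1, y2), y1 ^ y2) for y1, y2 in inputs])
i_single = mutual_information([(y1, y1 ^ y2) for y1, y2 in inputs])
print(i_joint, i_single)   # 1.0 0.0 -> more than the sum of the parts

# Redundancy: two perfectly correlated copies of the same bit.
i_copies = mutual_information([((y, y), y) for y in (0, 1)])
print(i_copies)            # 1.0 -> no more than either copy alone
```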
<br><br />
<br><br />
Links: [http://arxiv.org/abs/0712.2218]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
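A drastically simplified sketch of this setup (hedged: the talk's organisms are computer programs with algorithmic fitness, whereas here they are plain bitstrings with a counting fitness) shows the random walk of increasing fitness under point mutations:

```python
# Toy hill-climbing evolution: mutate at random, keep strict improvements.
import random

def evolve(n_bits=32, seed=0):
    """Returns the number of mutations tried before reaching maximal fitness."""
    rng = random.Random(seed)
    organism = [0] * n_bits
    fitness, steps = 0, 0
    while fitness < n_bits:
        steps += 1
        mutant = organism[:]
        mutant[rng.randrange(n_bits)] ^= 1   # random point mutation
        if sum(mutant) > fitness:            # hill climb: keep strict improvements
            organism, fitness = mutant, sum(mutant)
    return steps

print(evolve())   # mutations needed; grows roughly like n log n for this fitness
```

The interesting question in the talk is the analogous rate of progress when fitness is algorithmic rather than additive.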
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
----<br />
'''Framing Complexity'''<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<br />
We will present a new explanation for the distribution of words in<br />
natural language that is grounded in information theory and inspired<br />
by recent research on excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [http://arxiv.org/abs/0810.3125] and [http://arxiv.org/abs/0911.5318]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these results lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
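In symbols (paraphrased from the linked paper, so the notation is indicative rather than authoritative), with <math>\mathcal{S}^+</math> and <math>\mathcal{S}^-</math> the predictive and retrodictive causal states and <math>C_\mu^\pm</math> their state entropies, the quantities above read:

```latex
\mathbf{E} \;=\; I[\overleftarrow{X};\overrightarrow{X}] \;=\; I[\mathcal{S}^+;\mathcal{S}^-],
\qquad
\chi^+ \;=\; C_\mu^+ - \mathbf{E},
\qquad
\Xi \;=\; C_\mu^+ - C_\mu^-,
```

with <math>\chi^+</math> the crypticity and <math>\Xi</math> the causal irreversibility.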
<br><br />
<br><br />
Links: [http://arxiv.org/abs/0905.3587]<br />
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
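For the simplest systems in the list above, order-1 binary Markov chains, both coordinates of the complexity-entropy diagram have closed forms: the entropy rate is the stationary average of the row entropies, and, because block entropies are affine in block length for order-1 chains, the excess entropy reduces to E = H[X0] - h. A small sketch (my own illustration, not taken from the talk):

```python
# (entropy rate, excess entropy) points for binary first-order Markov chains.
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def diagram_point(p, q):
    """Coordinates for the chain with Pr(0 -> 1) = p and Pr(1 -> 0) = q."""
    pi0, pi1 = q / (p + q), p / (p + q)   # stationary distribution
    h = pi0 * h2(p) + pi1 * h2(q)         # entropy rate h_mu
    return h, h2(pi0) - h                 # E = H[X_0] - h_mu for order-1 chains

# An ensemble of chains traces out a region of the complexity-entropy diagram.
points = [diagram_point(p / 10, q / 10) for p in range(1, 10) for q in range(1, 10)]

h, E = diagram_point(1.0, 0.5)   # golden-mean-like chain: 0 is never followed by 0
print(round(h, 4), round(E, 4))  # 0.6667 0.2516
```

An i.i.d. coin (p = q = 1/2) lands at E = 0, and a period-2 chain (p = q = 1) at h = 0, E = 1 bit, bracketing the diagram's extremes.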
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
the study of local complexity measures for higher-dimensional and<br />
heterogeneous systems is a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
In this talk I argue that a fundamental measure of physical complexity is obtained from the parallel computational complexity of sampling states of the system. After motivating this idea, I will briefly review relevant aspects of computational complexity theory, discuss the properties of the proposed measure of physical complexity and illustrate the ideas with some examples from statistical physics. <br />
<br><br />
<br><br />
Links: [http://arxiv.org/abs/cond-mat/0510809]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes of infinite cryptic order--those whose internal state information is spread across arbitrarily long observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [http://arxiv.org/abs/0905.4787]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
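As a concrete anchor for the density classification task, here is a sketch of the classic hand-designed Gacs-Kurdyumov-Levin (GKL) rule (a standard baseline for this task, not a rule from the talk): each cell votes with two neighbors to its left when it is 0, and two to its right when it is 1.

```python
# GKL cellular automaton for density (majority) classification.
def gkl_step(s):
    """One synchronous update of the GKL rule on a circular lattice."""
    n = len(s)
    out = []
    for i in range(n):
        if s[i] == 0:
            votes = s[i] + s[(i - 1) % n] + s[(i - 3) % n]   # look left
        else:
            votes = s[i] + s[(i + 1) % n] + s[(i + 3) % n]   # look right
        out.append(1 if votes >= 2 else 0)
    return out

def classify(s, steps=None):
    """Iterate the CA for about 2N steps; returns the final configuration."""
    for _ in range(steps if steps is not None else 2 * len(s)):
        s = gkl_step(s)
    return s

state = [1] * 21
state[7] = 0                 # a single dissenting cell
print(classify(state))       # relaxes to all ones: majority correctly classified
```

Filters of the kind reviewed in the talk are then applied to the space-time diagrams such rules produce, to expose the propagating structures that carry out the computation.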
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review of, and commentary on, the fundamental concepts of computational complexity, beyond the usual discussion of P, NP, and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, as well as several topics that are current hotbeds of interdisciplinary research: phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf] and [http://www.nature-of-computation.org/]<br />
<br />
----<br />
'''Dominos, Ergodic Flows'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We present a model, developed with Norman Packard, of a simple discrete open flow system. Dimers are created at one edge of a two-dimensional lattice, diffuse across, and are removed at the opposite edge. A steady-state flow is established under various kinetic rules. In the equilibrium case, the system reduces to the classical monomer-dimer tiling problem, whose entropy as a function of density is known. This entropy density is reproduced locally in the flow system, as shown by statistics over local templates. The goal is to clarify the informational aspects of a flowing pattern.<br />
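A one-dimensional warm-up (my own toy, not the talk's two-dimensional model) shows the kind of exact configuration counting involved: tilings of a 1 x n strip by monomers and dimers satisfy a(n) = a(n-1) + a(n-2), and the entropy per site converges to the log of the golden ratio.

```python
# Exact monomer-dimer counting on a 1 x n strip and its entropy density.
from math import log

def monomer_dimer_count(n):
    """Tilings of a 1 x n strip by monomers and dimers: a(n) = a(n-1) + a(n-2)."""
    a, b = 1, 1            # a(0), a(1)
    for _ in range(n - 1):
        a, b = b, a + b    # first cell: a monomer (n-1 left) or a dimer (n-2 left)
    return b

for n in (10, 100, 1000):
    s = log(monomer_dimer_count(n)) / n   # entropy per site, in nats
    print(n, round(s, 4))
# The entropy density converges to ln((1 + sqrt(5)) / 2), about 0.4812 nats/site.
```

The two-dimensional problem the abstract refers to requires transfer-matrix or template statistics instead of a scalar recurrence, but the entropy-per-site logic is the same.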
<br><br />
<br><br />
Links: [http://arxiv.org/abs/1002.0344]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model-making and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emergent property. Interestingly, the explorative component is present even in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [http://arxiv.org/abs/0709.1948]<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the<br />
Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies<br />
and fluctuations of the local expansion rate are related by a generalization<br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of<br />
dynamical systems is illustrated by some examples: clustering and<br />
synchronization, self-organized criticality, and the topological structure of<br />
networks.<br />
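For orientation, the most familiar ergodic parameter, the Lyapunov exponent, is the Birkhoff average of the local expansion rate. A minimal sketch for the logistic map (my own illustration, not from the talk):

```python
# Lyapunov exponent of the logistic map as a time average of log|f'(x)|.
from math import log

def lyapunov(r, x0=0.3, n=200_000, burn=1_000):
    """Lyapunov exponent of x -> r*x*(1-x), estimated along one orbit."""
    x = x0
    for _ in range(burn):                       # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += log(max(abs(r * (1 - 2 * x)), 1e-12))  # guard against x = 1/2
        x = r * x * (1 - x)
    return total / n

print(lyapunov(4.0))   # numerically close to ln 2, the exact value at r = 4
```

The higher-order ergodic parameters of the talk refine this single number with the fluctuation statistics of the same local expansion rate.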
<br><br />
<br><br />
Links: [http://arxiv.org/abs/1008.2664]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the corresponding classical models do. This indicates that the minimal-entropy system exhibiting such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than is classically possible when quantum effects are involved.<br />
<br><br />
<br><br />
Links: (Section V of) [http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]<br />
----</div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=38926Randomness, Structure and Causality - Abstracts2011-01-05T00:51:26Z<p>Jmachta: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''Effective Complexity of Stationary Process Realizations'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
The concept of effective complexity of an object as the minimal description length of its regularities has been initiated by Gell-Mann and Lloyd. Based on their work we gave a precise definition of effective complexity of finite binary strings in terms of algorithmic information theory in our previous paper. Here we study the effective complexity of strings generated by stationary processes. Sufficiently long typical process realizations turn out to be effectively simple under any linear scaling with the string's length of the parameter $\Delta$ which determines the minimization domain. For a class of computable ergodic processes including i.i.d. and ergodic Markovian processes a stronger result can be shown: There exist sublinear scalings of $\Delta$ for which typical realizations turn out to be effectively simple. Our results become most transparent in the context of coarse effective complexity --a modification of plain effective complexity, where $\Delta$ appears as a minimization argument. A similar modification of the closely related concept of sophistication has been invented by Antunes and Fortnow as coarse sophistication.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state-machine that can be used to model time series. The machine is deterministically coupled to the inputs unlike stochastic generative models like the Kalman filter and HMM’s. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this is an average sum of forward and backward time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Identification of Functional Information Subgraphs in Complex Networks'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
We present a general information theoretic approach for identifying functional subgraphs in complex networks. We show that the uncertainty in a variable can be written as a sum of information quantities, where each term is generated by successively conditioning mutual informations on new measured variables in a way analogous to a discrete differential calculus. The analogy to a Taylor series suggests efficient optimization algorithms for determining the state of a target variable in terms of functional groups of other nodes. We apply this methodology to electrophysiological recordings of cortical neuronal networks grown in vitro. Each cell’s firing is generally explained by the activity of a few neurons. We identify these neuronal subgraphs in terms of their redundant or synergetic character and reconstruct neuronal circuits that account for the state of target cells.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
----<br />
'''Framing Complexity'''<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<p><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems is a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how to best understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, and several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Thermodynamics and Local Complexity of Domino Gases'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We determine the local Shannon-Boltzmann entropy for a domino gas model by exactly counting configurations. This allows us to compute the entropy from a local template, comparing it with known classical values for monomer-dimer tilings. We then study the diffusion of heat through the system via a source of dominos at one end and a sink at the other. In this setting we estimate the density gradient of the nonequilibrium steady state, using various statistics to measure a macroscopic "conductivity".<br />
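As a toy version of entropy-by-exact-counting (an added one-dimensional illustration, not the talk's model): on a 1 x n strip, monomer-dimer tilings satisfy the Fibonacci recursion T(n) = T(n-1) + T(n-2), and the entropy per site converges to the logarithm of the golden ratio.

```python
from math import log, sqrt

def tilings_1d(n):
    """Count monomer-dimer tilings of a 1 x n strip.
    T(n) = T(n-1) + T(n-2): the left end holds either a monomer or a dimer."""
    a, b = 1, 1  # T(0), T(1)
    for _ in range(n - 1):
        a, b = b, a + b
    return b if n >= 1 else 1

counts = [tilings_1d(n) for n in range(1, 7)]  # Fibonacci numbers 1, 2, 3, 5, 8, 13

n = 200
entropy_per_site = log(tilings_1d(n)) / n      # approaches log of the golden ratio
golden = (1 + sqrt(5)) / 2
print(counts, entropy_per_site, log(golden))
```

The two-dimensional counting in the talk replaces this recursion with a transfer matrix over strip configurations, but the entropy-per-site limit is taken the same way.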
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows a simple principle by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emergent property. Interestingly, the explorative component is present even in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Measuring the Complexity of Psychological States'''<br />
<br><br />
<br><br />
Tononi, Giulio (gtononi@wisc.edu)<br><br />
University of Wisconsin<br />
<br><br><br />
Links:<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: Clustering and <br />
synchronization, self-organized criticality and the topological structure of <br />
networks.<br />
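The most familiar of these ergodic parameters, the Lyapunov exponent, is computed as an orbit average of log|f'(x)|. A small sketch (an added illustration, not from the talk) for the fully chaotic logistic map, whose exact exponent is ln 2:

```python
from math import log

def lyapunov_logistic(r, x0=0.34, n=100_000, transient=1_000):
    """Estimate the Lyapunov exponent of the map x -> r*x*(1-x)
    as the time average of log|f'(x)| = log|r*(1 - 2*x)| along an orbit."""
    x = x0
    for _ in range(transient):        # discard transient behavior
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

lam = lyapunov_logistic(4.0)  # exact value at r = 4 is log(2), about 0.693
print(lam)
```

The higher-order ergodic parameters of the talk generalize exactly this kind of time average to fluctuations of the local expansion rate.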
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Participants&diff=38925Randomness, Structure and Causality - Participants2011-01-05T00:50:44Z<p>Jmachta: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<table border="1"><br />
<tr><br />
<th>Name</th><br />
<th>Email</th><br />
<th>Institution</th><br />
<th>Talk</th><br />
<th>Paper</th><br />
</tr><br />
<tr><br />
<td>*Ay, Nihat</td><br />
<td>nay@mis.mpg.de</td> <br />
<td>SFI & Max Planck Institute</td><br />
<td>Effective Complexity of Stationary Process Realizations</td><br />
<td>[[http://arxiv.org/abs/1001.2686]]</td><br />
</tr><br />
<tr><br />
<td>*Bell, Tony</td><br />
<td>tony@salk.edu</td><br />
<td>UC Berkeley</td><br />
<td>Learning Out of Equilibrium</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Bettencourt, Luis</td><br />
<td>lmbettencourt@gmail.com</td><br />
<td>SFI & LANL</td><br />
<td>Identification of Functional Information Subgraphs in Complex Networks</td><br />
<td>[[http://arxiv.org/abs/0712.2218]]</td><br />
</tr><br />
<tr><br />
<td>*Chaitin, Gregory</td><br />
<td>gjchaitin@gmail.com</td><br />
<td>IBM Watson Research Center</td><br />
<td>To a Mathematical Theory of Evolution and Biological Creativity</td><br />
<td>[[Media:Darwin.pdf| Paper]]</td><br />
</tr><br />
<tr> <br />
<td>*Crutchfield, James</td><br />
<td>chaos@cse.ucdavis.edu</td><br />
<td>SFI & UC Davis</td><br />
<td>Framing Complexity</td><br />
<td>[[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]</td><br />
</tr><br />
<tr><br />
<td>*Debowski, Lukasz</td><br />
<td>ldebowsk@ipipan.waw.pl</td><br />
<td>Polish Academy of Sciences</td><br />
<td>The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts</td><br />
<td>[[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]</td><br />
</tr><br />
<tr><br />
<td>*Ellison, Christopher</td><br />
<td>cellison@cse.ucdavis.edu</td><br />
<td>UC Davis</td><br />
<td>Prediction, Retrodiction, and the Amount of Information Stored in the Present</td><br />
<td>[[http://arxiv.org/abs/0905.3587]]</td><br />
</tr><br />
<tr><br />
<td>*Feldman, David</td><br />
<td>dave@hornacek.coa.edu</td><br />
<td>College of the Atlantic</td><br />
<td>Complexity Measures and Frustration</td><br />
<td>[[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]] </td><br />
</tr><br />
<tr><br />
<td>*Mahoney, John</td><br />
<td>jmahoney3@ucmerced.edu</td><br />
<td>UC Merced</td><br />
<td>Crypticity and Information Accessibility</td><br />
<td>[[http://arxiv.org/abs/0905.4787]]</td><br />
</tr><br />
<tr><br />
<td>*Machta, Jon</td><br />
<td>machta@physics.umass.edu</td><br />
<td>SFI & University of Massachusetts</td><br />
<td>Complexity, Parallel Computation and Statistical Physics</td><br />
<td>[[http://arxiv.org/abs/cond-mat/0510809]]</td><br />
</tr><br />
<tr><br />
<td>*Mitchell, Melanie</td><br />
<td>mm@cs.pdx.edu</td><br />
<td>SFI & Portland State University</td><br />
<td>Automatic Identification of Information-Processing Structures in Cellular Automata</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Moore, Cris</td><br />
<td>moore@cs.unm.edu</td><br />
<td>SFI & University of New Mexico</td><br />
<td>Phase Transitions and Computational Complexity</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Shaw, Robert Stetson</td><br />
<td>rob@protolife.net</td><br />
<td>ProtoLife, Inc.</td><br />
<td>Thermodynamics and Local Complexity of Domino Gases</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Still, Suzanne</td><br />
<td>sstill@hawaii.edu</td><br />
<td>University of Hawaii at Manoa</td><br />
<td>Statistical Mechanics of Interactive Learning</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>Tononi, Giulio</td><br />
<td>gtononi@wisc.edu</td><br />
<td>University of Wisconsin</td><br />
<td>Measuring the Complexity of Psychological States </td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Vilela-Mendes, Rui</td><br />
<td>vilela@cii.fc.ul.pt</td><br />
<td>University of Lisbon</td><br />
<td>Ergodic Parameters and Dynamical Complexity</td><br />
<td>[[http://arxiv.org/abs/1008.2664]]</td><br />
</tr><br />
<tr><br />
<td>Trabesinger, Andreas</td><br />
<td>a.trabesinger@nature.com</td><br />
<td>Nature Physics Magazine</td><br />
<td>Measuring Complexity?</td><br />
<td>[[http://www.nature.com/nphys/]]</td><br />
</tr><br />
<tr><br />
<td>*Wiesner, Karoline</td><br />
<td>k.wiesner@bristol.ac.uk</td><br />
<td>University of Bristol</td><br />
<td>Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States</td><br />
<td>[[http://arxiv.org/abs/1002.2337]]</td><br />
</tr><br />
</table><br />
<nowiki>*Confirmed</nowiki></div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Abstracts&diff=38923Randomness, Structure and Causality - Abstracts2011-01-04T18:20:28Z<p>Jmachta: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
<br />
<br><br />
----<br />
'''Effective Complexity of Stationary Process Realizations'''<br />
<br><br />
<br><br />
Ay, Nihat (nay@mis.mpg.de)<br />
<br><br />
SFI & Max Planck Institute<br />
<br><br />
<br><br />
The concept of the effective complexity of an object, as the minimal description length of its regularities, was introduced by Gell-Mann and Lloyd. Building on their work, we previously gave a precise definition of the effective complexity of finite binary strings in terms of algorithmic information theory. Here we study the effective complexity of strings generated by stationary processes. Sufficiently long typical process realizations turn out to be effectively simple under any scaling of the parameter $\Delta$, which determines the minimization domain, that is linear in the string's length. For a class of computable ergodic processes, including i.i.d. and ergodic Markovian processes, a stronger result can be shown: there exist sublinear scalings of $\Delta$ for which typical realizations turn out to be effectively simple. Our results become most transparent in the context of coarse effective complexity--a modification of plain effective complexity in which $\Delta$ appears as a minimization argument. A similar modification of the closely related concept of sophistication was introduced by Antunes and Fortnow as coarse sophistication.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1001.2686]]<br />
----<br />
'''Learning Out of Equilibrium'''<br />
<br><br />
Bell, Tony (tony@salk.edu)<br />
<br><br />
UC Berkeley<br />
<br><br />
<br><br />
Inspired by new results in non-equilibrium statistical mechanics, we define a new kind of state machine that can be used to model time series. The machine is deterministically coupled to the inputs, unlike stochastic generative models such as the Kalman filter and HMMs. The likelihood in this case is shown to be a sum of local time likelihoods. We introduce a new concept, the second-order-in-time stochastic gradient, which derives from the time derivative of the likelihood, showing that the latter decomposes into a ‘work’ term, a ‘heat’ term and a term describing time asymmetry in the state machine’s dynamics. This motivates the introduction of a new time-symmetric likelihood function for time series. Our central result is that the time derivative of this likelihood is an average sum of forward- and backward-time ‘work’ terms, in which all partition functions, which plague Dynamic Bayesian Networks, have cancelled out. We can now do tractable time series density estimation with arbitrary models, without sampling. This is a direct result of doing second-order-in-time learning with time-symmetric likelihoods. A model is proposed, based on parameterised energy-based Markovian kinetics, with the goal of learning (bio)chemical networks from data, and taking a step towards understanding molecular-level energy-based self-organisation.<br />
<br><br />
<br><br />
Links:<br />
----<br />
'''Identification of Functional Information Subgraphs in Complex Networks'''<br />
<br><br />
<br><br />
Bettencourt, Luis (lmbettencourt@gmail.com)<br />
<br><br />
SFI & LANL<br />
<br><br />
<br><br />
We present a general information-theoretic approach for identifying functional subgraphs in complex networks. We show that the uncertainty in a variable can be written as a sum of information quantities, where each term is generated by successively conditioning mutual informations on new measured variables, in a way analogous to a discrete differential calculus. The analogy to a Taylor series suggests efficient optimization algorithms for determining the state of a target variable in terms of functional groups of other nodes. We apply this methodology to electrophysiological recordings of cortical neuronal networks grown in vitro. Each cell’s firing is generally explained by the activity of a few neurons. We identify these neuronal subgraphs in terms of their redundant or synergetic character and reconstruct neuronal circuits that account for the state of target cells.<br />
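The effect of successive conditioning is visible in a minimal synergy example (an added illustration, not from the paper): when a target is the XOR of two inputs, each input alone carries zero mutual information about the target, but conditioning on the other input reveals one full bit.

```python
from collections import Counter
from math import log2
from itertools import product

def entropy(samples):
    """Shannon entropy (bits) of a list of hashable outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_info(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def cond_mutual_info(xs, ys, zs):
    """I(X;Y|Z) = I(X;(Y,Z)) - I(X;Z) -- one conditioning step of the sum."""
    yz = list(zip(ys, zs))
    return mutual_info(xs, yz) - mutual_info(xs, zs)

# Uniform samples over both inputs; target = XOR, a purely synergetic function.
x1s, x2s, ts = [], [], []
for a, b in product([0, 1], repeat=2):
    x1s.append(a); x2s.append(b); ts.append(a ^ b)

print(mutual_info(ts, x1s))            # 0 bits: x1 alone says nothing
print(cond_mutual_info(ts, x1s, x2s))  # 1 bit: conditioning on x2 reveals all
```

In the paper's terms, the pair (x1, x2) forms a synergetic functional subgraph for the target: information appears only in the conditioned terms of the expansion.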
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0712.2218]]<br />
----<br />
'''To a Mathematical Theory of Evolution and Biological Creativity'''<br />
<br><br />
<br><br />
Chaitin, Gregory (gjchaitin@gmail.com)<br />
<br><br />
IBM Watson Research Center<br />
<br><br />
<br><br />
We present an information-theoretic analysis of Darwin’s theory of evolution, modeled as a hill-climbing algorithm on a fitness landscape. Our space of possible organisms consists of computer programs, which are subjected to random mutations. We study the random walk of increasing fitness made by a single mutating organism. In two different models we are able to show that evolution will occur and to characterize the rate of evolutionary progress, i.e., the rate of biological creativity.<br />
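The random walk of increasing fitness can be sketched generically. The fitness function below (counting 1 bits) is a hypothetical stand-in chosen only for illustration; the models in the talk use mutating computer programs and algorithmic notions of fitness, not bit strings.

```python
import random

def evolve(genome_len=64, steps=2000, seed=0):
    """Single mutating organism on a hill-climbing walk: flip one random
    bit and keep the mutant only if fitness does not decrease."""
    rng = random.Random(seed)
    organism = [0] * genome_len
    fitness = sum(organism)            # stand-in fitness: number of 1 bits
    history = [fitness]
    for _ in range(steps):
        i = rng.randrange(genome_len)
        mutant = organism.copy()
        mutant[i] ^= 1                 # random point mutation
        if sum(mutant) >= fitness:     # hill climb: accept if no worse
            organism, fitness = mutant, sum(mutant)
        history.append(fitness)
    return history

h = evolve()
print(h[0], h[-1])  # fitness only increases along the walk
```

The quantity of interest in the talk is the rate at which such a walk climbs, i.e. the rate of biological creativity, under far richer mutation and fitness models than this sketch.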
<br><br />
<br><br />
Links: [[Media:Darwin.pdf| Paper]]<br />
----<br />
'''Framing Complexity'''<br />
<br><br />
<br><br />
Crutchfield, James (chaos@cse.ucdavis.edu)<br><br />
SFI & UC Davis<br />
<br><br />
<br><br />
Is there a theory of complex systems? And who should care, anyway?<br />
<br><br />
<br><br />
Links: [[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]<br />
<br />
----<br />
'''The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts'''<br />
<br><br />
<br />
Debowski, Lukasz (ldebowsk@ipipan.waw.pl)<br><br />
Polish Academy of Sciences<br><br />
<br><br />
<p><br />
We will present a new explanation for the distribution of words in<br />
natural language which is grounded in information theory and inspired<br />
by recent research in excess entropy. Namely, we will demonstrate a<br />
theorem with the following informal statement: If a text of length <math>n</math><br />
describes <math>n^\beta</math> independent facts in a repetitive way then the<br />
text contains at least <math>n^\beta/\log n</math> different words. In the<br />
formal statement, two modeling postulates are adopted. Firstly, the<br />
words are understood as nonterminal symbols of the shortest<br />
grammar-based encoding of the text. Secondly, the text is assumed to<br />
be emitted by a finite-energy strongly nonergodic source whereas the<br />
facts are binary IID variables predictable in a shift-invariant<br />
way. Besides the theorem, we will exhibit a few stochastic processes<br />
to which this and similar statements can be related.<br />
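For a rough numeric feel of the informal statement (an added illustration; the formal theorem fixes the constants and the base of the logarithm), the bound n^beta / log n evaluates as follows:

```python
from math import log

def min_vocabulary(n, beta):
    """Informal lower bound n**beta / log(n) on the number of distinct
    words in a length-n text that describes n**beta independent facts."""
    return n**beta / log(n)

for n in (10**4, 10**6, 10**8):
    print(n, round(min_vocabulary(n, 0.5)))
```

So a million-symbol text repetitively describing about a thousand facts must already use on the order of seventy distinct words.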
<br><br />
<br><br />
<br />
Links: [[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]<br />
<br />
----<br />
'''Prediction, Retrodiction, and the Amount of Information Stored in the Present'''<br />
<br><br />
<br><br />
Ellison, Christopher (cellison@cse.ucdavis.edu)<br><br />
Complexity Sciences Center, UC Davis<br />
<br><br />
<br>We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy--a familiar measure of organization in complex systems--is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system invariants for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.3587]]<br />
<br />
----<br />
'''Complexity Measures and Frustration'''<br />
<br><br />
<br><br />
Feldman, David (dave@hornacek.coa.edu)<br><br />
College of the Atlantic<br />
<br><br />
<br><br />
In this talk I will present some new results applying complexity<br />
measures to frustrated systems, and I will also comment on some<br />
frustrations I have about past and current work in complexity<br />
measures. I will conclude with a number of open questions and ideas<br />
for future research.<br />
<br />
I will begin with a quick review of the excess entropy/predictive<br />
information and argue that it is a well-understood and broadly<br />
applicable measure of complexity that allows for a comparison of<br />
information processing abilities among very different systems. The<br />
vehicle for this comparison is the complexity-entropy diagram, a<br />
scatter-plot of the entropy and excess entropy as model parameters are<br />
varied. This allows for a direct comparison in terms of the<br />
configurations' intrinsic information processing properties. To<br />
illustrate this point, I will show complexity-entropy diagrams for: 1D<br />
and 2D Ising models, 1D Cellular Automata, the logistic map, an<br />
ensemble of Markov chains, and an ensemble of epsilon-machines.<br />
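For a binary first-order Markov chain, a point on such a complexity-entropy diagram has a closed form: the entropy rate is h = sum_i pi_i H(P(.|i)), and for an order-1 chain the excess entropy reduces to E = H(X_0) - h. A small sketch (an added illustration, not from the talk):

```python
from math import log2

def H(ps):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(p * log2(p) for p in ps if p > 0)

def complexity_entropy_point(p, q):
    """(entropy rate h, excess entropy E) for the binary Markov chain with
    P(1|0) = p and P(0|1) = q.  Stationary distribution: pi_0 = q/(p+q).
    For an order-1 Markov chain, E = H(X_0) - h."""
    pi0 = q / (p + q)
    pi1 = 1 - pi0
    h = pi0 * H([1 - p, p]) + pi1 * H([q, 1 - q])
    E = H([pi0, pi1]) - h
    return h, E

print(complexity_entropy_point(0.5, 0.5))  # unbiased coin: h = 1, E = 0
print(complexity_entropy_point(0.1, 0.1))  # sticky chain: lower h, higher E
```

Sweeping (p, q) and scatter-plotting the resulting (h, E) pairs produces exactly the Markov-chain ensemble mentioned above.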
<br />
I will then present some new work in which a local form of the 2D<br />
excess entropy is calculated for a frustrated spin system. This<br />
allows one to see how information and memory are shared unevenly<br />
across the lattice as the system enters a glassy state. These results<br />
show that localised information-theoretic complexity measures can be<br />
usefully applied to heterogeneous lattice systems. I will argue that<br />
local complexity measures for higher-dimensional and heterogeneous<br />
systems are a particularly fruitful area for future research.<br />
<br />
Finally, I will conclude by remarking upon some of the areas of<br />
complexity-measure research that have been sources of frustration.<br />
These include the persistent notions of a universal "complexity at<br />
the edge of chaos," and the relative lack of applications of<br />
complexity measures to empirical data and/or multidimensional systems.<br />
These remarks are designed to provoke dialog and discussion about<br />
interesting and fun areas for future research.<br />
<br><br />
<br><br />
Links: [[Media:afm.tri.5.pdf| Paper 1]] and [[Media:CHAOEH184043106_1.pdf| Paper 2]]<br />
----<br />
'''Complexity, Parallel Computation and Statistical Physics'''<br />
<br><br />
<br><br />
Machta, Jon (machta@physics.umass.edu)<br />
<br><br />
SFI & University of Massachusetts<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/cond-mat/0510809]]<br />
----<br />
'''Crypticity and Information Accessibility'''<br />
<br><br><br />
Mahoney, John (jmahoney3@ucmerced.edu)<br><br />
UC Merced<br />
<br><br />
<br><br />
We give a systematic expansion of the crypticity--a recently introduced measure of the inaccessibility of a stationary process's internal state information. This leads to a hierarchy of k-cryptic processes and allows us to identify finite-state processes that have infinite crypticity--the internal state information is present across arbitrarily long, observed sequences. The crypticity expansion is exact in both the finite- and infinite-order cases. It turns out that k-crypticity is complementary to the Markovian finite-order property that describes state information in processes. One application of these results is an efficient expansion of the excess entropy--the mutual information between a process's infinite past and infinite future--that is finite and exact for finite-order cryptic processes.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0905.4787]]<br />
<br />
----<br />
<br />
'''Automatic Identification of Information-Processing Structures in Cellular Automata'''<br />
<br><br />
<br><br />
Mitchell, Melanie (mm@cs.pdx.edu)<br />
<br><br />
SFI & Portland State University<br />
<br><br />
<br><br />
Cellular automata have been widely used as idealized models of natural spatially-extended dynamical systems. An open question is how best to understand such systems in terms of their information-processing capabilities. In this talk we address this question by describing several approaches to automatically identifying the structures underlying information processing in cellular automata. In particular, we review the computational mechanics methods of Crutchfield et al., the local sensitivity and local statistical complexity filters proposed by Shalizi et al., and the information-theoretic filters proposed by Lizier et al. We illustrate these methods by applying them to several one- and two-dimensional cellular automata that have been designed to perform the so-called density (or majority) classification task.<br />
----<br />
'''Phase Transitions and Computational Complexity'''<br />
<br><br />
<br><br />
Moore, Cris (moore@cs.unm.edu)<br />
<br><br />
SFI & University of New Mexico<br />
<br><br />
<br><br />
A review and commentary on the fundamental concepts of computational complexity, beyond the usual discussion of P, NP and NP-completeness, in an attempt to explain the deep meaning of the P vs. NP question. I'll discuss counting, randomized algorithms, and higher complexity classes, and several topics that are current hotbeds of interdisciplinary research, like phase transitions in computation, Monte Carlo algorithms, and quantum computing.<br />
<br><br />
<br><br />
Links: [[http://www-e.uni-magdeburg.de/mertens/publications/cise.pdf]] and [[http://www.nature-of-computation.org/]]<br />
<br />
----<br />
'''Thermodynamics and Local Complexity of Domino Gases'''<br />
<br><br />
<br><br />
Shaw, Rob (rob@protolife.net)<br><br />
ProtoLife, Inc.<br />
<br><br />
<br><br />
We determine the local Shannon-Boltzmann entropy for a domino gas model by exactly counting configurations. This allows us to compute the entropy from a local template, comparing it with known classical values for monomer-dimer tilings. We then study the diffusion of heat through the system via a source of dominos at one end and a sink at the other. In this setting we estimate the density gradient of the nonequilibrium steady state, using various statistics to measure a macroscopic "conductivity".<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1002.0344]]<br />
----<br />
'''Statistical Mechanics of Interactive Learning'''<br />
<br><br />
<br><br />
Still, Suzanne (sstill@hawaii.edu)<br><br />
University of Hawaii at Manoa<br />
<br><br />
<br><br />
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows a simple principle by requiring that the observer’s world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process’s causal organization in the presence of the learner’s actions. A fundamental consequence of the proposed principle is that the learner’s optimal action policies balance exploration and control as an emergent property. Interestingly, the explorative component is present even in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/0709.1948]]<br />
----<br />
'''Measuring the Complexity of Psychological States'''<br />
<br><br />
<br><br />
Tononi, Giulio (gtononi@wisc.edu)<br><br />
University of Wisconsin<br />
<br><br><br />
Links:<br />
----<br />
'''Ergodic Parameters and Dynamical Complexity'''<br />
<br><br />
<br><br />
Vilela-Mendes, Rui (vilela@cii.fc.ul.pt)<br />
<br><br />
University of Lisbon<br />
<br><br />
<br><br />
Using a cocycle formulation, old and new ergodic parameters beyond the <br />
Lyapunov exponent are rigorously characterized. Dynamical Rényi entropies <br />
and fluctuations of the local expansion rate are related by a generalization <br />
of the Pesin formula.<br />
How the ergodic parameters may be used to characterize the complexity of <br />
dynamical systems is illustrated by some examples: Clustering and <br />
synchronization, self-organized criticality and the topological structure of <br />
networks.<br />
<br><br />
<br><br />
Links: [[http://arxiv.org/abs/1008.2664]]<br />
----<br />
'''Quantum Statistical Complexity -- Sharpening Occam's Razor with Quantum Mechanics'''<br />
<br><br />
<br><br />
Wiesner, Karoline (k.wiesner@bristol.ac.uk)<br />
<br><br />
University of Bristol<br />
<br><br />
<br><br />
Mathematical models are an essential component of quantitative science. They generate predictions about the future, based on information available in the present. In the spirit of Occam’s razor, simpler is better; should two models make identical predictions, the one that requires less input is preferred. This is the basis of causal-state models. The amount of information required for optimal prediction is the statistical complexity. We systematically construct quantum models that require less information for optimal prediction than the classical models do. This indicates that the system of minimal entropy that exhibits such statistics must necessarily feature quantum dynamics, and that certain phenomena could be significantly simpler than classically possible should quantum effects be involved.<br />
<br><br />
<br><br />
Links: (Section V of) [[http://link.aip.org/link/CHAOEH/v20/i3/p037114/s1&Agg=doi]]<br />
----</div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality&diff=38769Randomness, Structure and Causality2010-12-16T02:10:16Z<p>Jmachta: </p>
<hr />
<div><br />
{{Randomness, Structure and Causality}}<br />
<br />
<br />
== Organizers ==<br />
Jim Crutchfield (UC Davis)<br />
<br />
Jon Machta (UMass Amherst)<br />
<br />
----<br />
<br />
Workshop summary: [[File:Rsc.pdf]]</div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality&diff=38768Randomness, Structure and Causality2010-12-16T02:09:29Z<p>Jmachta: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
== Organizers ==<br />
Jim Crutchfield (UC Davis)<br />
<br />
Jon Machta (UMass Amherst)<br />
<br />
Workshop summary: [[File:Rsc.pdf]]</div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality&diff=38767Randomness, Structure and Causality2010-12-16T02:08:42Z<p>Jmachta: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<br />
== Organizers ==<br />
Jim Crutchfield (UC Davis)<br />
Jon Machta (UMass Amherst)<br />
<br />
Workshop summary: [[File:Rsc.pdf]]</div>Jmachtahttps://wiki.santafe.edu/index.php?title=Randomness,_Structure_and_Causality_-_Bios&diff=38766Randomness, Structure and Causality - Bios2010-12-16T01:52:37Z<p>Jmachta: </p>
<hr />
<div>{{Randomness, Structure and Causality}}<br />
<br />
<table border="1"><br />
<tr><br />
<th>Name</th><br />
<th>Email</th><br />
<th>Institution</th><br />
<th>Talk</th><br />
<th>Paper</th><br />
</tr><br />
<tr><br />
<td>*Ay, Nihat</td><br />
<td>nay@mis.mpg.de</td> <br />
<td>SFI & Max Planck Institute</td><br />
<td>Effective Complexity of Stationary Process Realizations</td><br />
<td>[[http://arxiv.org/abs/1001.2686]]</td><br />
</tr><br />
<tr><br />
<td>*Bell, Tony</td><br />
<td>tony@salk.edu</td><br />
<td>UC Berkeley</td><br />
<td>Learning Out of Equilibrium</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>Bergstrom, Carl</td><br />
<td>cbergst@u.washington.edu</td><br />
<td>SFI & University of Washington</td><br />
<td>The Transmission Sense of Information</td><br />
<td>[[http://arxiv.org/abs/0810.4168]]</td><br />
</tr><br />
<tr><br />
<td>Bialek, William</td><br />
<td>wbialek@Princeton.EDU</td><br />
<td>Princeton University</td><br />
<td>Optimizing Information Flow in Small Genetic Networks</td><br />
<td>[[http://arxiv.org/abs/0912.5500]]</td><br />
</tr><br />
<tr><br />
<td>*Chaitin, Gregory</td><br />
<td>gjchaitin@gmail.com</td><br />
<td>IBM Watson Research Center</td><br />
<td>To a Mathematical Theory of Evolution and Biological Creativity</td><br />
<td>[[File:Darwin.pdf]]</td><br />
</tr><br />
<tr> <br />
<td>*Crutchfield, James</td><br />
<td>chaos@cse.ucdavis.edu</td><br />
<td>SFI & UC Davis</td><br />
<td>Framing Complexity</td><br />
<td>[[http://users.cse.ucdavis.edu/~cmg/compmech/pubs.htm]]</td><br />
</tr><br />
<tr><br />
<td>*Debowski, Lukasz</td><br />
<td>ldebowsk@ipipan.waw.pl</td><br />
<td>Polish Academy of Sciences</td><br />
<td>The Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts</td><br />
<td>[[http://arxiv.org/abs/0810.3125]] and [[http://arxiv.org/abs/0911.5318]]</td><br />
</tr><br />
<tr><br />
<td>*Ellison, Christopher</td><br />
<td>cellison@cse.ucdavis.edu</td><br />
<td>UC Davis</td><br />
<td>Prediction, Retrodiction, and the Amount of Information Stored in the Present</td><br />
<td>[[http://arxiv.org/abs/0905.3587]]</td><br />
</tr><br />
<tr><br />
<td>*Feldman, David</td><br />
<td>dave@hornacek.coa.edu</td><br />
<td>College of the Atlantic</td><br />
<td>Spatial Information Theory</td><br />
<td>[[File:afm.tri.5.pdf]] and [[File:CHAOEH184043106_1.pdf]] </td><br />
</tr><br />
<tr><br />
<td>*Mahoney, John</td><br />
<td>jmahoney3@ucmerced.edu</td><br />
<td>UC Merced</td><br />
<td>Crypticity and Information Accessibility</td><br />
<td>[[http://arxiv.org/abs/0905.4787]]</td><br />
</tr><br />
<tr><br />
<td>*Machta, Jon</td><br />
<td>machta@physics.umass.edu</td><br />
<td>SFI & University of Massachusetts</td><br />
<td>Complexity, Parallel Computation and Statistical Physics</td><br />
<td>[[http://arxiv.org/abs/cond-mat/0510809]]</td><br />
</tr><br />
<tr><br />
<td>*Mitchell, Melanie</td><br />
<td>mm@cs.pdx.edu</td><br />
<td>SFI & Portland State University</td><br />
<td>Automatic Identification of Information-Processing Structures in Cellular Automata</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Still, Suzanne</td><br />
<td>sstill@hawaii.edu</td><br />
<td>University of Hawaii at Manoa</td><br />
<td>Statistical Mechanics of Interactive Learning</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>Tononi, Giulio</td><br />
<td>gtononi@wisc.edu</td><br />
<td>University of Wisconsin–Madison</td><br />
<td>Measuring the Complexity of Psychological States</td><br />
<td></td><br />
</tr><br />
<tr><br />
<td>*Vilela-Mendes, Rui</td><br />
<td>vilela@cii.fc.ul.pt</td><br />
<td>University of Lisbon</td><br />
<td>Ergodic Parameters and Dynamical Complexity</td><br />
<td>[[http://arxiv.org/abs/1008.2664]]</td><br />
</tr><br />
<tr><br />
<td>Trabesinger, Andreas</td><br />
<td>a.trabesinger@nature.com</td><br />
<td>Nature Physics Magazine</td><br />
<td>Measuring Complexity?</td><br />
<td>[[http://www.nature.com/nphys/]]</td><br />
</tr><br />
<tr><br />
<td>*Wiesner, Karoline</td><br />
<td>k.wiesner@bristol.ac.uk</td><br />
<td>University of Bristol</td><br />
<td>Hidden Quantum Markov Models and Non-adaptive Read-out of Many-body States</td><br />
<td>[[http://arxiv.org/abs/1002.2337]]</td><br />
</tr><br />
</table><br />
<nowiki>*Confirmed</nowiki></div>