Deep Computation in Statistical Physics

Deep Computation in Statistical Physics Workshop
Organized by Cris Moore (SFI), Jon Machta (University of Massachusetts, Amherst & SFI), and Stephan Mertens (Magdeburg & SFI)
Santa Fe Institute
August 8-10, 2013


Abstract
Computational physics was born in 1953, when Metropolis, Rosenbluth, Rosenbluth, Teller, and Teller at Los Alamos introduced the idea of Monte Carlo simulations to study the properties of fluids. The same year, Enrico Fermi, John R. Pasta, and Stanislaw Ulam used the MANIAC-I computer at Los Alamos to simulate a system of nonlinearly coupled oscillators. Sparked by these two seminal papers, simulations have become a third source of knowledge alongside the much older sources, theory and experiment.

Most simulations "put the physics on a computer" by mimicking physical processes in a straightforward way: we flip single spins in magnets, or trace individual molecules from one collision to the next. This is useful and natural, yet it does not deploy the full power of the algorithmic approach to physics. Complex systems can be investigated more deeply by advanced and innovative algorithms that do not try to imitate physics, but instead probe physical problems using a deep mathematical understanding of the system, skipping over much of the intermediate behavior to access its long-time or large-scale behavior directly. Examples of this kind of "deep computation" include algorithms that compute minimum spanning trees for percolation systems, and Monte Carlo simulations that use non-local, unphysical updates to sample from the equilibrium distribution, or to find a system's ground state, much faster than the natural physical dynamics would.
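To make the first example concrete, here is a minimal sketch, not taken from the workshop materials, of how a Kruskal-style union-find pass locates the exact crossing point of a bond percolation sample without simulating any dynamics. The function names, and the choice of an L x L lattice with wired left and right boundaries, are our own illustrative assumptions.

```python
import random

def find(parent, x):
    """Union-find root lookup with path compression."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(parent, a, b):
    """Merge the components containing a and b."""
    parent[find(parent, a)] = find(parent, b)

def crossing_fraction(L, rng=random.Random()):
    """One sample of the fraction of bonds that must be occupied before
    the left and right edges of an L x L square lattice first connect.
    Adding bonds in order of i.i.d. random weights (Kruskal style) finds
    this crossing point in a single sorted pass."""
    n = L * L
    parent = list(range(n + 2))
    LEFT, RIGHT = n, n + 1  # virtual nodes wiring the two boundary columns
    for y in range(L):
        union(parent, LEFT, y * L)
        union(parent, RIGHT, y * L + L - 1)
    bonds = [(rng.random(), y * L + x, y * L + x + 1)
             for y in range(L) for x in range(L - 1)]       # horizontal bonds
    bonds += [(rng.random(), y * L + x, (y + 1) * L + x)
              for y in range(L - 1) for x in range(L)]      # vertical bonds
    bonds.sort()  # the bond with the smallest weight is occupied first
    for k, (_, i, j) in enumerate(bonds, start=1):
        union(parent, i, j)
        if find(parent, LEFT) == find(parent, RIGHT):
            return k / len(bonds)
    return 1.0
```

Averaged over many samples, crossing_fraction(L) converges as L grows to the bond percolation threshold of the square lattice, p_c = 1/2.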

These algorithms not only provide new tools for studying complex systems; their inner workings often yield fresh insights into the systems themselves. For example, the Swendsen-Wang cluster algorithm depends on a subtle mapping between spin models and percolation models, and it gives a geometric interpretation to the ordering that occurs at the critical point of spin systems. At a more fundamental level, understanding the best algorithms available for simulating a physical system sheds light on the natural complexity of that system.
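To make that mapping concrete, the following is a minimal sketch of one Swendsen-Wang update of the ferromagnetic Ising model; the flat spin array, J = 1, and periodic boundaries are our own illustrative assumptions. Aligned neighbors are bonded with the Fortuin-Kasteleyn probability 1 - exp(-2*beta), and the resulting percolation clusters, rather than individual spins, are what get flipped.

```python
import math
import random

def swendsen_wang_step(spins, L, beta, rng=random.Random()):
    """One Swendsen-Wang update of an L x L Ising model with J = 1 and
    periodic boundaries.  spins is a flat list of +1/-1 values."""
    p = 1.0 - math.exp(-2.0 * beta)  # Fortuin-Kasteleyn bond probability
    n = L * L
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # Occupy a bond between aligned neighbors with probability p.
    for y in range(L):
        for x in range(L):
            i = y * L + x
            for j in (y * L + (x + 1) % L, ((y + 1) % L) * L + x):
                if spins[i] == spins[j] and rng.random() < p:
                    union(i, j)

    # Assign every cluster a fresh random orientation (flip with prob 1/2).
    new_spin = {}
    for i in range(n):
        r = find(i)
        if r not in new_spin:
            new_spin[r] = rng.choice((-1, 1))
        spins[i] = new_spin[r]
```

Near the critical point beta_c = ln(1 + sqrt(2))/2 ≈ 0.4407, these cluster moves decorrelate the system in a handful of sweeps, where local single-spin dynamics suffers severe critical slowing down.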

In this workshop we will bring together physicists, computer scientists, and practitioners from related fields to share knowledge and collaborate on the development, analysis, and application of new algorithms. We will particularly emphasize algorithms for the systems studied in statistical physics, because their simple formulation gives them wide relevance to many other scientific fields. A paradigmatic example is the Ising model and its many variants. Originally designed to model magnetic systems, but now applied to fields as diverse as image analysis and opinion formation in social systems, Ising systems are simply defined yet display a rich array of complex behavior induced by many-body interactions. Because most complex systems studied in statistical physics are stochastic, we will focus mainly, though not exclusively, on Monte Carlo algorithms, i.e., algorithms that rely essentially on pseudorandomness for their operation.
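For contrast with the cluster move sketched above, here is the "straightforward" local dynamics in the same illustrative style: a single Metropolis sweep of the L x L Ising model, again assuming J = 1 and periodic boundaries. Its brevity shows just how simply the model is defined.

```python
import math
import random

def metropolis_sweep(spins, L, beta, rng=random.Random()):
    """One sweep (L*L attempted flips) of single-spin-flip Metropolis
    for the L x L Ising model with J = 1 and periodic boundaries."""
    n = L * L
    for _ in range(n):
        i = rng.randrange(n)
        x, y = i % L, i // L
        # Sum of the four nearest-neighbor spins.
        h = (spins[y * L + (x + 1) % L] + spins[y * L + (x - 1) % L] +
             spins[((y + 1) % L) * L + x] + spins[((y - 1) % L) * L + x])
        dE = 2 * spins[i] * h  # energy cost of flipping spin i
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
```

Starting from spins = [rng.choice((-1, 1)) for _ in range(L * L)], alternating such sweeps with the cluster updates above gives a simple testbed for comparing their equilibration times.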

The year 2013 is the 60th anniversary of the seminal paper of Metropolis et al. that initiated the simulation age and made possible the quantitative understanding of complex systems. Moreover, powerful new approaches have been developed in the past decade that are not yet widely disseminated across fields, and whose potential has yet to be fully realized. For instance, simulated annealing is widely known, but more advanced techniques like parallel tempering are only just starting to be applied outside statistical physics. Similarly, lifting a Markov chain by giving its state a kind of momentum can make it much faster, but it also makes the chain irreversible, putting it beyond the reach of many standard tools for analyzing equilibration times.
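As a pointer to what these techniques look like in practice, here is a minimal sketch of the swap move at the heart of parallel tempering (replica exchange). It assumes each replica carries its current configuration and energy, with some local dynamics, such as the Metropolis sweep above, running between swap rounds; the names and bookkeeping are illustrative rather than canonical.

```python
import math
import random

def tempering_swaps(betas, energies, states, rng=random.Random()):
    """Attempt to exchange configurations between each adjacent pair of
    temperatures.  Accepting a swap with probability
    min(1, exp((beta_k - beta_{k+1}) * (E_k - E_{k+1}))) preserves the
    joint Boltzmann distribution while letting cold replicas inherit
    configurations that wandered freely at high temperature."""
    for k in range(len(betas) - 1):
        d_beta = betas[k] - betas[k + 1]
        d_E = energies[k] - energies[k + 1]
        # min(0, ...) caps the exponent, avoiding overflow for sure-accepts.
        if rng.random() < math.exp(min(0.0, d_beta * d_E)):
            states[k], states[k + 1] = states[k + 1], states[k]
            energies[k], energies[k + 1] = energies[k + 1], energies[k]
```

In practice the temperature ladder is tuned so that adjacent replicas swap at a moderate rate, commonly quoted as roughly 20-40%, so that configurations can diffuse across the whole ladder.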

The time is ripe to bring together physicists, computer scientists, and others in complex systems to accelerate the development and understanding, both experimental and rigorous, of these new algorithmic approaches. By hosting this work at SFI, we hope to establish SFI as one of the institutional homes of advanced algorithms and mathematically powerful techniques.