CSSS 2009 Santa Fe - Tutorials
Feel free to organize your own tutorials.
Post tutorial schedules here.
For ease of scheduling, please make amendments to the schedule.
Also, a tutorial outline page wouldn't be bad.
Information theory leading on to Maximum Entropy (and maybe maximum entropy production)
Steven Lade: Well, nobody responded to my request, so perhaps I'll do it myself :) I learnt a little at another summer school, so maybe I'm the best qualified; I was just hoping I wasn't. :)
Maximum entropy comes out of information theory (which I'm not expert in, so if there's someone knowledgeable to help me there, that would be great). It states simply that the most likely state of a system is the one which maximises its (information-theoretic) entropy. The entropy of statistical mechanics comes naturally from it: indeed, stat mech can be viewed as a subset of information theory from this perspective. But you can apply it to any sort of system, not just physics. It's the closest thing to a 'theory of complex systems' that I've seen!
Tentative time: 7PM, Monday June 15. Add your name if interested.
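(For anyone who wants the calculation behind the statement above before the session, here is a minimal textbook sketch of the Jaynes/Lagrange-multiplier argument; the notation is generic and not tied to any particular talk. Maximizing the Shannon entropy subject to normalization and a fixed mean energy gives the Boltzmann distribution:)

<math>H[p] = -\sum_i p_i \ln p_i, \qquad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle</math>

<math>\mathcal{L} = -\sum_i p_i \ln p_i \;-\; \lambda_0 \Big( \sum_i p_i - 1 \Big) \;-\; \beta \Big( \sum_i p_i E_i - \langle E \rangle \Big)</math>

<math>\frac{\partial \mathcal{L}}{\partial p_i} = -\ln p_i - 1 - \lambda_0 - \beta E_i = 0 \quad\Longrightarrow\quad p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}</math>

With other constraints in place of the mean energy, the same construction gives other exponential-family distributions, which is the sense in which the principle is not restricted to physics.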
- I am interested and will happily wear the skeptic hat: essentially if the entropy is sensitive to your parametrization, it loses much of its meaning. I think this is why some people (such as Amari) like to do "coordinate-free statistics". -Gustavo Lacerda
- I'm no expert, but I do have information theory basics under my belt... and more importantly, I happen to have some introductory info. theo. latex slides for a talk I recently gave. So I can offer help with the background, and then sit back & listen to the higher level, post-Shannon applications :) -Rosemary Braun
- On edit: I just noticed Lucas & Roozbeh requested an info theo tutorial -- shall we just roll it all together? -Rosemary Braun
- Lucas Lacasa: Great, I'm in. Steve, with the MaxEnt stuff do you mean Jaynes? I've seen Jaynes' formalism two or three times but never saw how it differs from a mere Lagrange-multiplier optimization, so it would be great to hear about that. Rosemary, I would also be glad to learn some information theory stuff. I know the ultra-basics (Shannon entropy, mutual information, and not much more...).
Angela Onslow: I'd like to come along to this
Roozbeh Daneshvar: FYI, we have an ABM tutorial on Monday, June 15th, starting at 4:30 PM. I assume it will be finished by 7:00 PM.
Kate Behrman: I also do not claim to be an expert, but have gone over a lot of the theory that is commonly used in biology. I would like to know what physicists think. Count me in!
Steven Lade: Rosemary, would be great if you could start with information theory. Lucas, yes I do mean Jaynes. Lagrange multiplier optimisation is really all that happens, so maybe it is all the same -- I confess I don't know much stat mech either! This session is going to be very much everyone contributing I think. And sounds like you may know more than me, Kate.
Allison Shaw: I'd like to come -- I've had the basics of information theory, but could definitely use a refresher and I'd love to learn the more advanced stuff.
watson: Sounds great. I know nothing about entropy maximization (other than the basic statement of the 2nd law of thermodynamics). But I would be glad to help out Rosemary with explaining some info theory stuff... Rosemary, perhaps we can chat before Monday. I'm thinking: entropy, conditional entropy, mutual information and possibly some topics beyond.
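(For reference before the session, the standard definitions of the three quantities mentioned above, for discrete random variables X and Y:)

<math>H(X) = -\sum_x p(x) \log p(x)</math>

<math>H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y)</math>

<math>I(X;Y) = H(X) - H(X \mid Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}</math>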
Elliot Martin: There are some differences between maximum entropy in thermodynamics, and what Jaynes does. I would be happy to help out if you want, but I just know a little bit about this stuff.
Massimo Mastrangeli: I am definitely not an expert, but I have picked up bits of information theory over time, so I'd like to wrap it all up somehow. I'll join you.
Lara Danilova-Burdess: Count me in.
Matt McMahon: I'm in.
Erin Taylor: Thanks Steven! I'll be there.
Liliana Salvador: Hey! This is great! I would love to join. I studied some information theory some time ago and its relationship with Kolmogorov Complexity (KC). It was a long time ago, but this is a good excuse to revise it. Any interest in hearing about this? Maybe I could give some basic introduction about KC and its relationship with information theory and then we could discuss the advantages and disadvantages of each method to measure information. What do you think?
MATLAB / Mathematica. Monday June 22nd 7pm (?)
Steven Lade: Hiro, I will happily give a tutorial on the basics of these languages (but only the basics, I don't use any of the fancy bits). Anyone here know if the labs here run MATLAB/Mathematica?
If there is anyone else interested (leave your name) I'll arrange a time, otherwise I'll talk directly with you Hiro.
Karen Simpson: I would also like a tutorial on MATLAB. I know some things about it, but never really learned the basics so it takes me a long time to do things. The computer labs should be equipped with MATLAB. I also have a fairly updated version on my laptop.
Lucas Lacasa: I'm also interested! I also know some basics, but I usually program in Fortran, so I'd love to learn it.
Steven, many thanks! I think that the computers in the lab have MATLAB. If not, we can access it through my school's server as long as the internet connection is stable. Hirotoshi Yoshioka/lakiaypayaska
Brian Hollar I'm also very interested! I've never used these languages, but have some basic knowledge of Java, NetLogo, and FORTRAN. I'd appreciate the help and would love to learn.
Milena Tsvetkova Count me in! Steve, what is a good time for you?
Wei Ni: Hi ppl. Could I join in? I know MATLAB and I would like to further polish my MATLAB skills. Meanwhile I want to learn Mathematica.
Thanks for coordinating the tutorial and for 'setting up' the facilities, Steve and Lucas.
Chang Yu: Hello, I'm here! Thank you guys for this tutorial. I'm a beginner in MATLAB and very curious about it. I know MATLAB is a giant in mathematics, statistical analysis, image processing, and genetic algorithms, and I really want to learn more about that. By the way, I have the Matlab 7.0 installer and I'll bring it on my flash drive. If you have a more recent version, that would be better.
Steven Lade: Too late for this week I think, and next week is pretty full, so let's aim for 7PM Monday June 22.
Elliot Martin: I would be happy to help out with the MATLAB tutorial if you want. I don't use any of the fancy packages either though.
Marek Kwiatkowski: Excellent tutorial idea, I'll be there.
Margreth Keiler: I would like to join. I've never used these languages but I would like to get an idea.
Liliana Salvador: Good idea. I am in! I have some knowledge of matlab (but never used very fancy packages). I would love to learn Mathematica.
Statistical physics: applications to complex systems
Lucas Lacasa: Statistical physics is a rather huge field, so I'm thinking of building a tutorial that focuses on some specific topics related to complexity science, at a chat-like level, namely:
- Fundamentals of statistical mechanics: ensembles, partition function and associated thermodynamic quantities (free energy, entropy) and some other basic stuff.
- Critical phenomena: Phase transitions in physical, social, and algorithmic systems. Self-organized criticality as the counterpart of a critical phase transition. Relation between phase transitions and local bifurcations of dynamical systems.
- Monte Carlo simulations, ergodic theorem
- Specific example gathering all of the above: the Ising model (a minimal Monte Carlo sketch follows below)
- Other related topics that you may like to listen to
Maybe we could schedule it for next week, something like next Wednesday (17th) at 7pm? (provided that no PRIORITY things such as basketball or soccer games are scheduled). If anyone else is interested, please leave your name. Depending on the 'audience' we can fix one place or another...
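(As a pointer to what the Monte Carlo and Ising-model items above involve in practice, here is a minimal, illustrative Metropolis sketch in Python; the lattice size, temperature, and step count are arbitrary demo values, not anything planned for the session:)

<pre>
import numpy as np

rng = np.random.default_rng(0)
L, T, steps = 20, 2.27, 200000              # lattice size, temperature, MC steps (arbitrary demo values)
spins = rng.choice([-1, 1], size=(L, L))    # random initial spin configuration

for _ in range(steps):
    i, j = rng.integers(0, L, size=2)       # pick a random lattice site
    # sum of the four nearest neighbours with periodic boundaries
    nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
          + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
    dE = 2 * spins[i, j] * nb               # energy cost of flipping this spin (J = 1, k_B = 1)
    if dE <= 0 or rng.random() < np.exp(-dE / T):   # Metropolis acceptance rule
        spins[i, j] *= -1

print("magnetization per spin:", spins.mean())
</pre>

Near the critical temperature of the 2D square lattice (about 2.269 in these units), the magnetization fluctuates strongly, which is exactly the critical-phenomena part of the outline.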
- Steven Lade: Fantastic, I'm in. But there is 'Music on the Hill' 6-8pm on Wednesdays. Is 8pm getting too late for 'work'? Or I could miss an hour of the music. (On Tuesdays and Thursdays there are the nonlinear dynamics labs.)
- Lucas Lacasa: You're right. What about Friday the 19th after lunch?
- Roozbeh Daneshvar: I'd like to join this one.
- Corinne Teeter: I'd also like to come.
- Barbara Bauer: I'm interested too.
- Hirotoshi Yoshioka: I'd like to come.
- Gustavo Lacerda: I'm interested.
- Allison Shaw: I'd like to join as well.
- watson: Maybe you guys should rope Andrew Berdahl into this as well. Sounds like he could have something to add...
- Elliot Martin: I could talk a bit about self-organized criticality, and a bit about some mean-field theory, if you want some company.
- Jacopo Tagliabue: I'm interested, especially if tutorial includes a for-dummies part.
- Massimo Mastrangeli: I will join you!
- Lara Danilova-Burdess: Excellent. I'd like to come.
- Mareen Hofmann: Super! I'm in as well.
- Margreth Keiler: I'm interested, and join the for-dummies part ;-)
- Liliana Salvador: I am in!
Mathematical modeling in ecology. Wednesday, June 17th 7pm (?)
If we want to do this before the talks next week, Monday evening?
- Barbara Bauer: Whoa, I wish I had an overview of the whole field...
But one subset of this is food webs: recommended readings for next week's talks by Jennifer Dunne & Neo Martinez are Berlow et al. 2009, Simple predictions of interaction strengths in complex food webs, and Dunne et al. 2008, Compilation and network analysis of Cambrian food webs (I can send both if you're interested). This is the latest and probably best of their research agenda, so we could discuss them, and Sasha and I could talk about different approaches and the remaining big unsolved questions of community ecology. It would be cool if this happened before the talks (it would enable better questions and discussion during the talks), but looking at our schedule, there's not much chance for this...
About other ecological modeling: I guess I could give a general introduction to it, which will probably be biased towards the stuff I actually know something about: ODE-based modeling, networks, and probably adaptive dynamics. For the sake of complexity, I can tell you in which fields other types of models are used (e.g. individual-based or spatial distribution models), but I don't know too much about the details of these.
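(To make the 'ODE-based modeling' item concrete, here is a minimal predator-prey (Lotka-Volterra) integration in Python with SciPy; the parameter values and initial densities are arbitrary choices for illustration, not taken from any of the papers above:)

<pre>
import numpy as np
from scipy.integrate import odeint

# Lotka-Volterra predator-prey model: x = prey density, y = predator density
a, b, c, d = 1.0, 0.1, 0.5, 0.5   # illustrative parameters only

def lotka_volterra(state, t):
    x, y = state
    dxdt = a * x - b * x * y          # prey growth minus predation
    dydt = c * b * x * y - d * y      # predator growth from predation minus mortality
    return [dxdt, dydt]

t = np.linspace(0, 50, 2000)
trajectory = odeint(lotka_volterra, [10.0, 5.0], t)   # initial (prey, predator) densities
print("final densities (prey, predator):", trajectory[-1])
</pre>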
- Caroline Farrior: I agree an overview of the whole field is difficult. I would be willing to take part in this discussion and talk about the kinds of work I do with forest models and models I have seen done in many other areas of ecology. There are a lot of different kinds of models in ecology, which all have their own goals and utilities in combination with data and ideas. I would be very happy to listen to and perhaps contribute to such a discussion.
- Roozbeh Daneshvar: I'd like to join. I do not have much background in this area and any amount of review before the relevant classes is highly appreciated.
- Marek Kwiatkowski: I would be interested in learning about this too.
- Andrew Noble: This all sounds great. I'd be happy to lead a brief overview of stochastic process models in community ecology (focusing on Hubbell's neutral theory of biodiversity). How about next Wednesday at 7pm? I'd like to attend the MaxEnt workshop Monday evening, and there are conflicts with Hubler's nonlinear workshops on Tuesday and Thursday.
- Jacopo Tagliabue: I'd like to join, since it sounds very interesting. I do not have much background, so some introductory remarks will be appreciated!
Tutorial requests!
Either a formal 'lecture' or a casual 'chat' is fine!
- Time series analysis (requested by Steve Lade)
- I second this request, particularly nonlinear TS and/or using TISEAN (Matt McMahon)
- Maximum entropy / Maximum entropy production (requested by Steve Lade)
- Mathematical modelling in ecology (requested by Steve Lade)
- Some physics stuff: statistical mechanics; mean field theory; self-organized criticality and phase transitions; Ising model and the like (requested by Mareen Hofmann and Roozbeh Daneshvar)
- Lucas Lacasa: I can tell you about phase transitions, the Ising model, SOC, and general stat phys...
- Steven Lade: I'd like to hear about this too; can you schedule a tutorial?
- Evolutionary game theory (requested by Mareen Hofmann, Roozbeh Daneshvar, Massimo Mastrangeli)
- Ergodic theory (requested by Roozbeh Daneshvar)
- Information theory (requested by Roozbeh Daneshvar and Lucas Lacasa)
- Spectral graph theory (requested by Lucas Lacasa)
- Spin glass theory: Replica method (requested by Lucas Lacasa)
- If somebody can give a tutorial on Matlab and/or Mathematica, that would be nice. I'm also interested in the difference between the two programs (e.g., what each program is good at). Thanks in advance! Hiro/lakiaypayaska
- Fitting high-dimensional data with functions -- is there such a thing as 4D maximum likelihood estimation (MLE)? Also, how to tell if two 'blobs' of high-dimensional data are statistically the same or different. Corinne Teeter
- The MLE is defined for any parameter vector, i.e. any number of dimensions. Finding it is a different story, and can be hard if the objective is not convex. The fully non-parametric ("assumption-free") way to do this is a two-sample test. For 1D data, you could use something like the Kolmogorov-Smirnov test or Cramer-von Mises criterion. Essentially you are stuffing the points from sample 2 into bins defined by the points from sample 1 (or vice-versa): the null hypothesis is that the bins are filled uniformly. You want to find a generalization of this into higher dimensions. I imagine that textbooks on spatial statistics would have this information.
UPDATE: I think you could use Voronoi cells as your bins, and this works in any number of dimensions! -Gustavo Lacerda
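(For the 1D case mentioned above, the two-sample Kolmogorov-Smirnov test is essentially a one-liner in Python with SciPy; the samples below are synthetic, purely to show the call:)

<pre>
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
sample1 = rng.normal(0.0, 1.0, size=500)   # synthetic data, for illustration only
sample2 = rng.normal(0.2, 1.0, size=500)

# Null hypothesis: both samples were drawn from the same distribution
stat, pvalue = ks_2samp(sample1, sample2)
print("KS statistic = %.3f, p-value = %.3f" % (stat, pvalue))
</pre>

For the higher-dimensional 'blobs' question, the binning/Voronoi idea above would still need a test on the bin counts (e.g. chi-squared or a permutation test), since a clean multivariate generalization of the K-S test is less standard.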
- Bayesian inference. Maximum likelihood estimation. (requested by Steve Lade)
- I'd be happy to give a "casual chat" on this. Gustavo Lacerda
- I'd like to learn about normal modes and the Lagrangian and/or Hamiltonian formulation of classical mechanics. -Gustavo Lacerda
- I would be glad to discuss this with you. You might also be interested in how these deterministic systems give rise to randomness. -Nathan Hodas
- I'd like to learn some beyond-the-basic econometrics and data analysis, possibly focusing on how to build models in social sciences. Anyone? -Jacopo Tagliabue and Roozbeh Daneshvar
- I'd love to get introduced to game theory (Massimo Mastrangeli and Roozbeh Daneshvar)
- Can anyone tell me about graph grammars? (Massimo Mastrangeli)