Boaz Barak has a miniature dictionary for statisticians:

> I’ve always been curious about the statistical physics approach to problems from computer science. The physics-inspired algorithm survey propagation is the current champion for random 3SAT instances, statistical-physics phase transitions have been suggested as explaining computational difficulty, and statistical physics has even been invoked to explain why deep learning algorithms seem to often converge to useful local minima.
>
> Unfortunately, I have always found the terminology of statistical physics, “spin glasses”, “quenched averages”, “annealing”, “replica symmetry breaking”, “metastable states” etc., to be rather daunting.

Jaan Altosaar’s guided translation is great.

## Phase transitions in statistical inference

> There is a deep analogy between statistical inference and statistical physics; I will give a friendly introduction to both of these fields. I will then discuss phase transitions in two problems of interest to a broad range of data sciences: community detection in social and biological networks, and clustering of sparse high-dimensional data. In both cases, if our data becomes too sparse or too noisy, it suddenly becomes impossible to find the underlying pattern, or even tell if there is one. Physics both helps us locate these phase transitions, and design optimal algorithms that succeed all the way up to this point. Along the way, I will visit ideas from computational complexity, random graphs, random matrices, and spin glass theory.
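
The flagship example of such a threshold is the symmetric two-group stochastic block model. Writing $c_\text{in}$ and $c_\text{out}$ for the average numbers of within-group and between-group neighbours, the Kesten–Stigum bound of Decelle, Krzakala, Moore and Zdeborová says (if I have the constants right) that detecting the communities better than chance is possible exactly when

$$
(c_\text{in} - c_\text{out})^2 > 2\,(c_\text{in} + c_\text{out}),
$$

and for two groups no algorithm, efficient or not, succeeds below it.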

There is an overview lecture by Thomas Orton, which cites lots of the good stuff:

> Last week, we saw how certain computational problems like 3SAT exhibit a thresholding behavior, similar to a phase transition in a physical system. In this post, we’ll continue to look at this phenomenon by exploring a heuristic method, belief propagation (and the cavity method), which has been used to make hardness conjectures, and also has thresholding properties. In particular, we’ll start by looking at belief propagation for approximate inference on sparse graphs as a purely computational problem. After doing this, we’ll switch perspectives and see belief propagation motivated in terms of Gibbs free energy minimization for physical systems. With these two perspectives in mind, we’ll then try to use belief propagation to do inference on the stochastic block model. We’ll see some heuristic techniques for determining when BP succeeds and fails in inference, as well as some numerical simulation results of belief propagation for this problem. Lastly, we’ll talk about where this all fits into what is currently known about efficient algorithms and information theoretic barriers for the stochastic block model.
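
To pin down the first, purely computational perspective, here is a minimal sum-product sketch in numpy. This is my own toy, not code from the lecture; the four-node tree and the potentials `psi` (unary) and `phi` (pairwise) are invented for illustration. On a tree the resulting beliefs are exact marginals; the same update run on a loopy graph is the heuristic version discussed above.

```python
import numpy as np

# Sum-product belief propagation on a small pairwise model.
# Exact on trees; on loopy graphs it is the usual heuristic.
edges = [(0, 1), (1, 2), (1, 3)]                   # a 4-node tree
psi = {i: np.array([1.0, 1.0]) for i in range(4)}  # unary potentials
psi[0] = np.array([2.0, 1.0])                      # node 0 leans to state 0
coupling = np.array([[2.0, 1.0],
                     [1.0, 2.0]])                  # neighbours like to agree
phi = {e: coupling for e in edges}                 # pairwise potentials

neighbours = {i: [] for i in range(4)}
for i, j in edges:
    neighbours[i].append(j)
    neighbours[j].append(i)

# Directed messages m[(i, j)][x_j], initialised uniform.
m = {(i, j): np.full(2, 0.5) for i, j in edges}
m.update({(j, i): np.full(2, 0.5) for i, j in edges})

for _ in range(20):                                # plenty of sweeps for a tree
    for i, j in list(m):
        # phi is stored once per undirected edge; transpose it for the reverse direction
        pair = phi[(i, j)] if (i, j) in phi else phi[(j, i)].T
        incoming = psi[i].copy()
        for k in neighbours[i]:
            if k != j:
                incoming = incoming * m[(k, i)]
        msg = incoming @ pair                      # marginalise out x_i
        m[(i, j)] = msg / msg.sum()

# Beliefs: estimated (here, exact) marginals at each node.
for i in range(4):
    b = psi[i].copy()
    for k in neighbours[i]:
        b = b * m[(k, i)]
    print(i, b / b.sum())
```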

See Igor Carron’s “phase diagram” list, and stuff like (Oymak and Tropp 2015). Likely there are connections to Erdős–Rényi giant components and other complex-network phenomena in probabilistic graph learning. Read (Barbier 2015; Poole et al. 2016).
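
The Erdős–Rényi connection at least is easy to poke at numerically: in $G(n, c/n)$ the largest component jumps from microscopic to a finite fraction of the graph as the mean degree $c$ passes 1. A quick check (the graph size and degree grid are arbitrary choices of mine):

```python
import networkx as nx

# Largest-component fraction in G(n, c/n) as the mean degree c crosses 1.
n = 20_000
for c in [0.5, 0.9, 1.0, 1.1, 1.5, 2.0]:
    G = nx.fast_gnp_random_graph(n, c / n, seed=0)
    giant = max(nx.connected_components(G), key=len)
    print(f"mean degree {c:4.1f}: giant fraction {len(giant) / n:.3f}")
```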

## Replicator equations and evolutionary processes

See also evolution, game theory.

Gentle intro lecture by John Baez, *Biology as Information Dynamics*.

See (Baez 2011; Harper 2009; Shalizi 2009; Sinervo and Lively 1996).
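
For concreteness, here is a minimal sketch of the replicator equation itself, $\dot x_i = x_i\,[(Ax)_i - x^\top A x]$, integrated by crude forward Euler for the zero-sum rock–paper–scissors payoff matrix (the game behind Sinervo and Lively’s lizard morphs). Step size, horizon, and initial frequencies are arbitrary.

```python
import numpy as np

# Replicator dynamics dx_i/dt = x_i * ((A x)_i - x.(A x))
# for zero-sum rock-paper-scissors; orbits cycle around (1/3, 1/3, 1/3).
A = np.array([[ 0.0, -1.0,  1.0],
              [ 1.0,  0.0, -1.0],
              [-1.0,  1.0,  0.0]])

x = np.array([0.5, 0.3, 0.2])            # initial strategy frequencies
dt = 0.01
for step in range(5001):
    fitness = A @ x                      # payoff to each pure strategy
    x = x + dt * x * (fitness - x @ fitness)
    x = np.clip(x, 1e-12, None)
    x = x / x.sum()                      # Euler drifts off the simplex; renormalise
    if step % 1000 == 0:
        print(step, np.round(x, 3))
```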

## References

Bahri, Yasaman, Jonathan Kadmon, Jeffrey Pennington, Sam S. Schoenholz, Jascha Sohl-Dickstein, and Surya Ganguli. 2020. “Statistical Mechanics of Deep Learning.” *Annual Review of Condensed Matter Physics* 11 (1): 501–28. https://doi.org/10.1146/annurev-conmatphys-031119-050745.

Baldassi, Carlo, Christian Borgs, Jennifer T. Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, and Riccardo Zecchina. 2016. “Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes.” *Proceedings of the National Academy of Sciences* 113 (48): E7655–62. https://doi.org/10.1073/pnas.1608103113.

*Journal of Statistical Mechanics: Theory and Experiment* 2005 (05): P05012. https://doi.org/10.1088/1742-5468/2005/05/P05012.

Choromanska, Anna, Mikael Henaff, Michael Mathieu, Gérard Ben Arous, and Yann LeCun. 2015. “The Loss Surfaces of Multilayer Networks.” In *Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics*, 192–204. http://proceedings.mlr.press/v38/choromanska15.html.

*Inverse Problems* 34 (1): 014004. https://doi.org/10.1088/1361-6420/aa9a90.

*Physical Review Letters* 113 (14). https://doi.org/10.1103/PhysRevLett.113.148103.

*The Annals of Mathematical Statistics* 33 (3): 1021–38. https://doi.org/10.1214/aoms/1177704470.

Moore, Cristopher. 2017. “The Computer Science and Physics of Community Detection: Landscapes, Phase Transitions, and Hardness.” *Bulletin of the EATCS*, February. http://arxiv.org/abs/1702.00467.

Poole, Ben, Subhaneil Lahiri, Maithra Raghu, Jascha Sohl-Dickstein, and Surya Ganguli. 2016. “Exponential Expressivity in Deep Neural Networks Through Transient Chaos.” In *Advances in Neural Information Processing Systems 29*, edited by D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, and R. Garnett, 3360–68. Curran Associates, Inc. http://papers.nips.cc/paper/6322-exponential-expressivity-in-deep-neural-networks-through-transient-chaos.pdf.

Shalizi, Cosma Rohilla. 2009. “Dynamics of Bayesian Updating with Dependent Data and Misspecified Models.” *Electronic Journal of Statistics* 3: 1039–74. https://doi.org/10.1214/09-EJS485.

Sinervo, B., and C. M. Lively. 1996. “The Rock–Paper–Scissors Game and the Evolution of Alternative Male Strategies.” *Nature* 380 (6571): 240. https://doi.org/10.1038/380240a0.

*Annual Review of Statistics and Its Application* 4 (1): 447–79. https://doi.org/10.1146/annurev-statistics-060116-054026.

*Physica D: Nonlinear Phenomena*, Novel Computing Paradigms: Quo Vadis?, 237 (9): 1257–81. https://doi.org/10.1016/j.physd.2008.03.040.

Zdeborová, Lenka, and Florent Krzakala. 2016. “Statistical Physics of Inference: Thresholds and Algorithms.” *Advances in Physics* 65 (5): 453–552. https://doi.org/10.1080/00018732.2016.1211393.