Less bored by this than I used to be, since it turns out that not everything is ideal gases and spin glasses, as cute as classical statmech is.
Baldassi, Carlo, Christian Borgs, Jennifer T. Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, and Riccardo Zecchina. 2016. “Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes.” Proceedings of the National Academy of Sciences 113 (48): E7655–62.
Barbier, Jean. 2015. “Statistical Physics and Approximate Message-Passing Algorithms for Sparse Linear Estimation Problems in Signal Processing and Coding Theory.” arXiv:1511.01650 [Cs, Math], November.
Castellani, Tommaso, and Andrea Cavagna. 2005. “Spin-Glass Theory for Pedestrians.” Journal of Statistical Mechanics: Theory and Experiment 2005 (05): P05012.
Ghavasieh, Arsham, Carlo Nicolini, and Manlio De Domenico. 2020. “Statistical Physics of Complex Information Dynamics.” Physical Review E 102 (5): 052304.
Lin, Henry W., and Max Tegmark. 2016a. “Critical Behavior from Deep Dynamics: A Hidden Dimension in Natural Language.” arXiv:1606.06737 [Cond-Mat], June.
———. 2016b. “Why Does Deep and Cheap Learning Work So Well?” arXiv:1608.08225 [Cond-Mat, Stat], August.
Mehta, Pankaj, and David J. Schwab. 2014. “An Exact Mapping Between the Variational Renormalization Group and Deep Learning.” arXiv:1410.3831 [Cond-Mat, Stat], October.
Onsager, L., and S. Machlup. 1953. “Fluctuations and Irreversible Processes.” Physical Review 91 (6): 1505–12.
Shwartz-Ziv, Ravid, and Naftali Tishby. 2017. “Opening the Black Box of Deep Neural Networks via Information.” arXiv:1703.00810 [Cs], March.
Wiatowski, Thomas, Philipp Grohs, and Helmut Bölcskei. 2018. “Energy Propagation in Deep Convolutional Neural Networks.” IEEE Transactions on Information Theory 64 (7).