# Hamiltonian and Langevin Monte Carlo

Physics might be on to something

July 31, 2018 — November 14, 2022

Tags: Bayes, generative, geometry, how do science, information, Monte Carlo, physics, statistics

Hamiltonians, energy conservation in sampling. Handy. Summary would be nice.

## 1 Note salad from a Betancourt seminar

Michael Betancourt’s heuristic explanation of Hamiltonian Monte Carlo: regions of high density alone are no good; what we need is the “typical set”, the region where the product of differential volume and density is high. He motivates Markov chain Monte Carlo on this basis, as a way of exploring the typical set given points already in it, or of getting closer to it if we start outside. How do we get a central limit theorem? Via “geometric” ergodicity results. Hamiltonian Monte Carlo is a procedure for generating measure-preserving flows over phase space
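The typical-set point is easy to check numerically: although a standard Gaussian’s density peaks at the origin, in high dimensions nearly all its mass sits on a thin shell of radius $\sqrt{d}$. A minimal sketch (plain NumPy, nothing Betancourt-specific):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draws from a d-dimensional standard Gaussian concentrate near a shell of
# radius sqrt(d), even though the density is maximal at the origin: the
# typical set is where (density x volume) is large, not where density is.
for d in [1, 10, 100, 1000]:
    x = rng.standard_normal((10_000, d))
    r = np.linalg.norm(x, axis=1)
    print(f"d={d:>4}  mean radius={r.mean():7.2f}  sqrt(d)={np.sqrt(d):7.2f}")
```

For large $d$ the mean radius tracks $\sqrt{d}$ closely, which is why a sampler that chases the mode is looking in the wrong place.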

$$H(q,p)=-\log\big(\pi(p\mid q)\,\pi(q)\big).$$ So the gradient of the log target density influences the particle momentum, and we can use symplectic integrators to walk along trajectories (if I knew more about the numerical integration of ODEs I might know more about the benefits of this) in between random momentum perturbations. There is some further machinery about resampling states along trajectories to de-bias the numerical error, which is the NUTS extension to HMC.
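To make the moving parts concrete, here is a minimal HMC sketch with the standard leapfrog (a symplectic) integrator and a Metropolis correction for the discretisation error. This is a pedagogical toy under my own naming conventions (`U` for the negative log density, unit mass matrix), not any particular library’s API:

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    """Symplectic leapfrog integration of Hamiltonian dynamics, unit mass."""
    p = p - 0.5 * eps * grad_U(q)        # initial half-step in momentum
    for _ in range(n_steps - 1):
        q = q + eps * p                  # full-step in position
        p = p - eps * grad_U(q)          # full-step in momentum
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)        # final half-step in momentum
    return q, -p                         # negate momentum for reversibility

def hmc_step(q, U, grad_U, eps, n_steps, rng):
    p = rng.standard_normal(q.shape)     # resample momentum from N(0, I)
    q_new, p_new = leapfrog(q, p, grad_U, eps, n_steps)
    # Accept/reject on the change in total energy H = U(q) + |p|^2 / 2
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < -dH else q

# Example target: standard 2-d Gaussian, U(q) = -log pi(q) up to a constant
U = lambda q: 0.5 * q @ q
grad_U = lambda q: q

rng = np.random.default_rng(1)
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, U, grad_U, eps=0.1, n_steps=20, rng=rng)
    samples.append(q)
samples = np.array(samples)
print(samples.mean(axis=0), samples.std(axis=0))
```

The symplectic structure is what keeps $H$ nearly conserved over long trajectories, so the acceptance rate stays high even for many leapfrog steps.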

## 2 Discontinuous likelihood

The solution is MOAR PHYSICS; we can construct Hamiltonians that sample via reflection/refraction dynamics in the augmented state space; see Afshar and Domke (2015); Nishimura, Dunson, and Lu (2020).
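The simplest special case of these reflection dynamics is a hard constraint: when the trajectory crosses the boundary, reflect position and momentum off it and carry on. A sketch for a half-normal target on $[0,\infty)$, assuming my own toy leapfrog and naming (this is the hard-wall case only, not the general refraction scheme of the cited papers):

```python
import numpy as np

def reflective_hmc_step(q, U, grad_U, eps, n_steps, rng):
    """One HMC step with reflection at the hard constraint q >= 0."""
    p = rng.standard_normal()
    q_new, p_new = q, p
    for _ in range(n_steps):             # leapfrog with per-step half-kicks
        p_new = p_new - 0.5 * eps * grad_U(q_new)
        q_new = q_new + eps * p_new
        if q_new < 0:                    # trajectory crossed the wall:
            q_new, p_new = -q_new, -p_new  # reflect position and momentum
        p_new = p_new - 0.5 * eps * grad_U(q_new)
    # Reflection conserves energy here (U is symmetric about 0), so the
    # usual Metropolis correction still applies.
    dH = (U(q_new) + 0.5 * p_new**2) - (U(q) + 0.5 * p**2)
    return q_new if np.log(rng.uniform()) < -dH else q

# Target: half-normal on [0, inf), U(q) = q^2 / 2
U = lambda q: 0.5 * q * q
grad_U = lambda q: q

rng = np.random.default_rng(2)
q, samples = 1.0, []
for _ in range(5000):
    q = reflective_hmc_step(q, U, grad_U, eps=0.1, n_steps=20, rng=rng)
    samples.append(q)
print(np.mean(samples))  # half-normal mean is sqrt(2/pi), roughly 0.80
```

A plain HMC step would simply be rejected whenever the trajectory wandered into the zero-density region; the bounce keeps the trajectory long and the acceptance rate high.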

## 3 Incoming

Manifold Monte Carlo.

George Ho, Understanding NUTS and HMC

In terms of reading code, I’d recommend looking through Colin Carroll’s minimc for a minimal working example of NUTS in Python, written for pedagogy rather than actual sampling. For a “real world” implementation of NUTS/HMC, I’d recommend looking through my littlemcmc for a standalone version of PyMC3’s NUTS/HMC samplers.

## 4 References

Afshar, and Domke. 2015. “Reflection, Refraction, and Hamiltonian Monte Carlo.”
Bales, Pourzanjani, Vehtari, et al. 2019. arXiv:1905.11916 [Stat].
Betancourt. 2017. arXiv:1701.02434 [Stat].
———. 2018. Annalen Der Physik.
Betancourt, Byrne, Livingstone, et al. 2017. Bernoulli.
Carpenter, Hoffman, Brubaker, et al. 2015. arXiv Preprint arXiv:1509.07164.
Caterini, Doucet, and Sejdinovic. 2018. In Advances in Neural Information Processing Systems.
Dai, Singh, Dai, et al. 2020. “Learning Discrete Energy-Based Models via Auxiliary-Variable Local Exploration.”
Devlin, Horridge, Green, et al. 2021. “The No-U-Turn Sampler as a Proposal Distribution in a Sequential Monte Carlo Sampler with a Near-Optimal L-Kernel.”
Durmus, and Moulines. 2016. arXiv:1605.01559 [Math, Stat].
Girolami, and Calderhead. 2011. Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Goodrich, Gelman, Hoffman, et al. 2017. Journal of Statistical Software.
Hoffman, and Gelman. 2011. “The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo.” arXiv Preprint arXiv:1111.4246.
Liu, Liu, and Ji. 2021.
Ma, Chen, and Fox. 2015. In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2. NIPS’15.
Mangoubi, and Smith. 2017. arXiv:1708.07114 [Math, Stat].
Margossian, Vehtari, Simpson, et al. 2020. arXiv:2004.12550 [Stat].
Mototake. 2019. “Conservation Law Estimation by Extracting the Symmetry of a Dynamical System Using a Deep Neural Network.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Neal. 2011. In Handbook for Markov Chain Monte Carlo.
Nishimura, Dunson, and Lu. 2020. Biometrika.
Norton, and Fox. 2016. arXiv:1610.00781 [Math, Stat].
Robert, Elvira, Tawn, et al. 2018. WIREs Computational Statistics.
Sansone. 2022. In Proceedings of the 39th International Conference on Machine Learning.
Strathmann, Sejdinovic, Livingstone, et al. 2015. In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 1. NIPS’15.
van de Meent, Paige, Yang, et al. 2021. arXiv:1809.10756 [Cs, Stat].
Xifara, Sherlock, Livingstone, et al. 2014. Statistics & Probability Letters.
Xu, Ge, Tebbutt, et al. 2019.