Hamiltonians, energy conservation in sampling. Handy. Summary would be nice.

Michael Betancourt’s heuristic explanation of Hamiltonian Monte Carlo: sets of high density alone are no good; we need the “typical set”, the set where the product of differential volume and density is high. Motivates Markov chain Monte Carlo on this basis, as a way of exploring the typical set given points already in it, or of getting closer to the typical set if starting outside it. How to get a central limit theorem? “Geometric” ergodicity results. Hamiltonian Monte Carlo is a procedure for generating measure-preserving flows over phase space.
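The typical-set intuition is easy to check numerically: for a standard Gaussian in $d$ dimensions, samples concentrate in a thin shell at radius $\approx\sqrt{d}$, far from the density mode at the origin. A minimal demonstration (my own illustration, not from Betancourt):

```python
import numpy as np

rng = np.random.default_rng(0)
for d in (1, 10, 100, 1000):
    x = rng.standard_normal((10_000, d))   # draws from N(0, I_d)
    r = np.linalg.norm(x, axis=1)          # distance from the mode
    # the mass concentrates near radius sqrt(d), not at the high-density point 0
    print(f"d={d:5d}  mean radius={r.mean():7.2f}  sqrt(d)={np.sqrt(d):7.2f}")
```

The mode is where the density is highest, but almost no volume (and hence almost no probability mass) lives there once $d$ is large.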

\[H(q,p)=-\log(\pi(p|q)\pi(q))\] So the gradient of the log target density influences the particle momentum. And we can use symplectic integrators to walk along trajectories (if I knew more numerical quadrature I might know more about the benefits of this) in between random momentum perturbations. Some more stuff about accepting or rejecting trajectories to de-bias numerical integration error, and about adaptively choosing trajectory lengths, which is the NUTS extension to HMC.
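A minimal sketch of one HMC transition, assuming a Gaussian momentum with identity mass matrix so that $H(q,p)=U(q)+\tfrac12 p^\top p$ with $U(q)=-\log\pi(q)$. The leapfrog scheme is the standard symplectic integrator here, and the Metropolis accept/reject step at the end is what de-biases the numerical integration error (function names are mine, not from any particular library):

```python
import numpy as np

def leapfrog(q, p, grad_U, step, n_steps):
    """Symplectic leapfrog integration of Hamiltonian dynamics."""
    p = p - 0.5 * step * grad_U(q)       # initial half step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p                 # full step in position
        p = p - step * grad_U(q)         # full step in momentum
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)       # final half step in momentum
    return q, p

def hmc_step(q, U, grad_U, step=0.1, n_steps=20, rng=None):
    """One HMC transition: momentum refresh, leapfrog flow, Metropolis correction."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(q.shape)     # random momentum perturbation
    q_new, p_new = leapfrog(q, p, grad_U, step, n_steps)
    # accept/reject on the change in total energy corrects integrator bias
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < -dH else q
```

For example, with `U = lambda q: 0.5 * q @ q` and `grad_U = lambda q: q` this targets a standard Gaussian; iterating `hmc_step` gives a chain whose sample moments converge to mean 0 and variance 1. NUTS replaces the fixed `n_steps` with an adaptive trajectory-length rule.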

## Langevin Monte Carlo

🏗

## To file

Manifold Monte Carlo.
