Hamiltonians and energy conservation in sampling. Handy. A summary would be nice.

Michael Betancourt's heuristic explanation of Hamiltonian Monte Carlo: regions of high density alone are no good; we need the "typical set", the set where the product of differential volume and density is high. He motivates Markov Chain Monte Carlo on this basis, as a way of exploring the typical set given points already in it, or of getting closer to it if starting outside. How do we get a central limit theorem? "Geometric" ergodicity results. Hamiltonian Monte Carlo is a procedure for generating measure-preserving flows over phase space.
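The typical-set point is easy to demonstrate numerically. A minimal sketch (my own illustration, not from Betancourt's paper): for a standard Gaussian in high dimension, the density is maximal at the origin, yet essentially no draw lands near it; the mass concentrates in a thin shell.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 100                                 # dimension
x = rng.standard_normal((10_000, d))    # i.i.d. draws from a standard Gaussian
r = np.linalg.norm(x, axis=1)           # distance of each draw from the mode

# The mode is at r = 0, but the draws concentrate near r ~ sqrt(d) = 10:
# differential volume grows like r^(d-1) and overwhelms the density decay.
print(f"mean radius {r.mean():.2f}, min radius {r.min():.2f}")
```

The mean radius comes out close to 10 and the minimum radius is still far from 0, which is why a sampler chasing the mode is chasing the wrong target.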

\[H(q,p)=-\log(\pi(p|q)\pi(q))\] So the gradient of the log density influences the particle's momentum, and we can use symplectic integrators to walk along trajectories (if I knew more numerical quadrature I might know more about the benefits of this) in between random momentum perturbations. There is some more material about resampling from trajectories to de-bias the numerical integration error, which is where the NUTS extension to HMC comes in.
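The recipe above can be sketched in a few lines. This is a minimal, assumption-laden version: identity mass matrix (so \(H(q,p) = -\log\pi(q) + p\cdot p/2\)), a leapfrog integrator, and a plain Metropolis accept step to remove discretisation bias, rather than the trajectory-resampling machinery of NUTS. The function name `hmc_step` and all tuning values are my own.

```python
import numpy as np

def hmc_step(q, log_prob, grad_log_prob, eps=0.1, n_leapfrog=20, rng=None):
    """One HMC transition targeting pi(q), identity mass matrix."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(q.shape)       # random momentum perturbation
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog: symplectic, so it nearly conserves H along the trajectory.
    p_new = p_new + 0.5 * eps * grad_log_prob(q_new)
    for _ in range(n_leapfrog - 1):
        q_new = q_new + eps * p_new
        p_new = p_new + eps * grad_log_prob(q_new)
    q_new = q_new + eps * p_new
    p_new = p_new + 0.5 * eps * grad_log_prob(q_new)
    # Metropolis correction de-biases the numerical error in H.
    h_old = -log_prob(q) + 0.5 * (p @ p)
    h_new = -log_prob(q_new) + 0.5 * (p_new @ p_new)
    return q_new if np.log(rng.uniform()) < h_old - h_new else q

# Usage: sample a standard 2D Gaussian.
log_prob = lambda q: -0.5 * (q @ q)
grad_log_prob = lambda q: -q
rng = np.random.default_rng(1)
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, log_prob, grad_log_prob, rng=rng)
    samples.append(q)
samples = np.array(samples)
```

Because the leapfrog integrator nearly conserves \(H\), the acceptance rate stays high even for long trajectories, which is the practical payoff over random-walk Metropolis.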

## Langevin Monte Carlo

🏗
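This section is still a stub, but for my own reference, a minimal sketch of the simplest variant, the unadjusted Langevin algorithm: Euler discretisation of the overdamped Langevin SDE \(\mathrm{d}q = \nabla\log\pi(q)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}W\). The function name `ula` and the step size are my own choices; there is no Metropolis correction, so the stationary distribution carries an \(O(\varepsilon)\) bias.

```python
import numpy as np

def ula(grad_log_prob, q0, eps=0.05, n=5000, rng=None):
    """Unadjusted Langevin algorithm targeting (approximately) pi(q)."""
    rng = rng or np.random.default_rng()
    q = np.array(q0, dtype=float)
    out = np.empty((n,) + q.shape)
    for i in range(n):
        # Drift up the log-density gradient, plus Gaussian diffusion noise.
        q = q + eps * grad_log_prob(q) + np.sqrt(2 * eps) * rng.standard_normal(q.shape)
        out[i] = q
    return out

# Usage: approximately sample a standard 1D Gaussian.
samples = ula(lambda q: -q, np.zeros(1), rng=np.random.default_rng(2))
```

Adding the Metropolis correction turns this into MALA, which removes the discretisation bias at the cost of occasional rejections.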

## To file

Manifold Monte Carlo.

## References

*arXiv:1701.02434 [stat]*, January.

*Annalen der Physik*, March.

*Bernoulli* 23 (4A): 2257–98.

*arXiv preprint arXiv:1509.07164*.

*arXiv:1605.01559 [math, stat]*, May.

*Journal of the Royal Statistical Society: Series B (Statistical Methodology)* 73 (2): 123–214.

*Journal of Statistical Software* 76 (1).

*Handbook of Markov Chain Monte Carlo*, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng. Boca Raton: Taylor & Francis.

*arXiv:1610.00781 [math, stat]*, October.

*Statistics & Probability Letters* 91 (Supplement C): 14–19.
