# Langevin dynamics MCMC

August 17, 2020 — September 5, 2024

Sampling by simulating a particular SDE, the Langevin equation. This can be rather close to optimising, via SDE representations of SGD.

## 1 Langevin dynamics
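A one-line reminder (standard material): the overdamped Langevin diffusion

\[ \mathrm{d}X_t = -\nabla f(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t \]

has stationary distribution \(\mu \propto e^{-f}\) under mild conditions on \(f\); the discrete-time algorithms here are Euler–Maruyama discretisations of it.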

## 2 Metropolis-adjusted Langevin algorithm (MALA)
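A minimal sketch of one MALA step, targeting \(\mu \propto e^{-f}\) for differentiable \(f\) (the function names and parameter values below are made up for illustration): the Langevin proposal is corrected by a Metropolis–Hastings accept/reject step, which removes the discretisation bias so the chain targets \(\mu\) exactly.

```python
import numpy as np

def mala_step(x, f, grad_f, eta, rng):
    """One MALA step targeting the density proportional to exp(-f)."""
    def log_q(a, b):
        # Log-density (up to a constant) of a ~ N(b - eta * grad_f(b), 2*eta*I).
        d = a - b + eta * grad_f(b)
        return -d @ d / (4 * eta)

    # Langevin proposal, as in the unadjusted update.
    y = x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
    # Metropolis-Hastings log acceptance ratio (log p = -f up to a constant).
    log_alpha = f(x) - f(y) + log_q(x, y) - log_q(y, x)
    if np.log(rng.uniform()) < log_alpha:
        return y, True
    return x, False

# Toy target: standard Gaussian, f(x) = ||x||^2 / 2.
rng = np.random.default_rng(0)
x, xs = np.zeros(2), []
for _ in range(5000):
    x, _ = mala_step(x, lambda z: 0.5 * z @ z, lambda z: z, 0.1, rng)
    xs.append(x)
xs = np.asarray(xs)
```

Because the accept/reject step enforces detailed balance, the step size \(\eta\) only affects mixing speed, not the stationary distribution.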

## 3 Continuous time

See log-concave distributions for a family of distributions where this works especially well, because implicit (more nearly continuous-time-exact) solutions are available (Hodgkinson, Salomone, and Roosta 2019).

Left-field, Max Raginsky, Sampling Using Diffusion Processes, from Langevin to Schrödinger:

> the Langevin process gives only approximate samples from \(\mu\). I would like to discuss an alternative approach that uses diffusion processes to obtain exact samples in finite time. This approach is based on ideas that appeared in two papers from the 1930s by Erwin Schrödinger in the context of physics, and is now referred to as the Schrödinger bridge problem.

## 4 Annealed

TBC Jolicoeur-Martineau et al. (2022); Song and Ermon (2020a); Song and Ermon (2020b).

Yang Song, Generative Modeling by Estimating Gradients of the Data Distribution
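Schematically (a toy sketch, not the exact scheme of the papers above; all parameter values are made up): run short Langevin chains at a decreasing sequence of noise scales, following the score of the noise-smoothed target. Here the toy target is Gaussian, so the smoothed score is available in closed form rather than learned.

```python
import numpy as np

def annealed_langevin(score, x0, sigmas, steps_per_level=50, eps=0.05, rng=None):
    """Annealed Langevin: short Langevin chains over decreasing noise scales.

    score(x, sigma) should approximate grad_x log p_sigma(x), the score of the
    target smoothed by N(0, sigma^2 I); sigmas runs from largest to smallest.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for sigma in sigmas:
        eta = eps * sigma ** 2  # shrink the step with the noise level (one common heuristic)
        for _ in range(steps_per_level):
            x = x + eta * score(x, sigma) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
    return x

# Toy target N(0, I): smoothing by sigma gives N(0, (1 + sigma^2) I),
# whose score is -x / (1 + sigma^2). 500 chains run in parallel as rows.
sigmas = np.geomspace(3.0, 0.1, num=10)
xs = annealed_langevin(lambda x, s: -x / (1 + s ** 2), np.zeros((500, 2)),
                       sigmas, rng=np.random.default_rng(0))
```

Starting at a large noise scale lets the chains move freely between modes; the later, small-noise levels sharpen the samples toward the target.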

## 5 Incoming

Holden Lee and Andrej Risteski introduce the connection between log-concavity and convex optimisation.

\[ x_{t+\eta} = x_t - \eta \nabla f(x_t) + \sqrt{2\eta}\xi_t,\quad \xi_t\sim N(0,I). \]
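The update above is the unadjusted Langevin algorithm (ULA); a minimal sketch of the iteration (target and step size made up for illustration):

```python
import numpy as np

def ula(grad_f, x0, eta=1e-2, n_steps=1000, rng=None):
    """Unadjusted Langevin algorithm targeting the density proportional to exp(-f).

    Iterates x <- x - eta * grad_f(x) + sqrt(2 * eta) * xi, xi ~ N(0, I).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        x = x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
        samples[t] = x
    return samples

# Toy target: standard Gaussian, f(x) = ||x||^2 / 2, so grad_f(x) = x.
samples = ula(lambda x: x, x0=np.zeros(2), eta=0.1, n_steps=5000,
              rng=np.random.default_rng(0))
```

Without a Metropolis correction the chain targets the discretised dynamics, not \(\mu\) itself, so there is an \(O(\eta)\) bias that vanishes as the step size shrinks.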

- Stochastic gradient Markov chain Monte Carlo - lyndonduong.com
- alisiahkoohi/Langevin-dynamics: Sampling with gradient-based Markov Chain Monte Carlo approaches
- langevin-monte-carlo/ula.py at master · abdulfatir/langevin-monte-carlo
- PYSGMCMC – Stochastic Gradient Markov Chain Monte Carlo Sampling — pysgmcmc documentation
- Stochastic gradient Langevin dynamics - Wikipedia

## 6 References

*Proceedings of the 29th International Conference on Machine Learning*. ICML’12.

*Proceedings of the 39th International Conference on Machine Learning*.

*Groundwater*.

*The Annals of Applied Probability*.

*Proceedings of the 32nd International Conference on Neural Information Processing Systems*. NIPS’18.

*Proceedings of the 31st International Conference on Machine Learning*.

*arXiv:1704.04752 [Math, Stat]*.

*Proceedings of the 27th International Conference on Neural Information Processing Systems - Volume 2*. NIPS’14.

*PMLR*.

*arXiv:1605.01559 [Math, Stat]*.

*Proceedings of the 32nd International Conference on Machine Learning*.

*SIAM Journal on Applied Dynamical Systems*.

*arXiv:1812.00793 [Cs, Math, Stat]*.

*Journal of the Royal Statistical Society: Series B (Statistical Methodology)*.

*Journal of the Royal Statistical Society: Series B (Methodological)*.

*Journal of Machine Learning Research*.

*Communications in Mathematical Sciences*.

*The Annals of Applied Probability*.

*arXiv:1903.12322 [Cs, Stat]*.

*arXiv:2006.11239 [Cs, Stat]*.

*Proceedings of the 36th International Conference on Machine Learning*.

*Journal of Machine Learning Research*.

*Soft Matter*.

*Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence*. UAI ’04.

*arXiv:1610.00781 [Math, Stat]*.

*Nuclear Physics B*.

*Stochastic Processes and Applications: Diffusion Processes, the Fokker-Planck and Langevin Equations*. Texts in Applied Mathematics.

*Statistics & Probability Letters*.

*Journal of the Royal Statistical Society. Series B (Statistical Methodology)*.

*Bernoulli*.

*Reports on Progress in Physics*.

*Advances in Neural Information Processing Systems*. NIPS’15.

*Advances In Neural Information Processing Systems*.

*Advances In Neural Information Processing Systems*.

*Proceedings of the 28th International Conference on Machine Learning*. ICML’11.

*Statistics & Probability Letters*.

*Proceedings of the 39th International Conference on Machine Learning*.

*International Conference on Artificial Intelligence and Statistics*.