Monte Carlo methods



Finding functionals (traditionally integrals) approximately by guessing cleverly. Often, but not always, used for approximate statistical inference. The most prominent use case is probably Bayesian statistics, where various Monte Carlo methods, especially Markov chain ones, turn out to be effective for various inference problems. That is far from the only use, however.

MC theory

General MC theory is big! As a taster, try Better than Monte Carlo, an incredibly compact introduction explaining the core complexity results (Chopin and Gerber 2023; Novak 2015):

Say I want to approximate the integral \[ I(f):=\int_{[0,1]^s} f(u)\, d u \] based on \(n\) evaluations of function \(f\). I could use plain old Monte Carlo: \[ \hat{I}(f)=\frac{1}{n} \sum_{i=1}^n f\left(U_i\right), \quad U_i \sim \mathrm{U}\left([0,1]^s\right), \] whose RMSE (root mean square error) is \(O\left(n^{-1 / 2}\right)\). Can I do better? That is, can I design an alternative estimator/algorithm, which performs \(n\) evaluations and returns a random output, such that its RMSE converges more quickly? Surprisingly, the answer to this question has been known for a long time. If I am ready to focus on functions \(f \in \mathcal{C}^r\left([0,1]^s\right)\), Bakhvalov (1959) showed that the best rate I can hope for is \(O\left(n^{-1 / 2-r / s}\right)\). That is, there exist algorithms that achieve this rate, and algorithms achieving a better rate simply do not exist.
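To make the baseline concrete, here is a minimal sketch of the plain Monte Carlo estimator \(\hat{I}(f)\) with an empirical check of its \(O(n^{-1/2})\) RMSE. Python and numpy are my choice here; the test integrand and all constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_estimate(f, n, s):
    """Plain Monte Carlo estimate of the integral of f over the unit cube [0,1]^s."""
    u = rng.uniform(size=(n, s))  # n i.i.d. uniform points in [0,1]^s
    return f(u).mean()

# Test integrand with a known integral: f(u) = u_1 * u_2 over [0,1]^2 gives I(f) = 1/4.
f = lambda u: u.prod(axis=1)
truth = 0.25

# The empirical RMSE over repeated runs should shrink roughly like n**-0.5:
# multiplying n by 100 should divide the RMSE by about 10.
for n in [10**2, 10**4, 10**6]:
    errors = [mc_estimate(f, n, 2) - truth for _ in range(20)]
    print(n, np.sqrt(np.mean(np.square(errors))))
```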

More lavish introductions: Rubinstein and Kroese (2016); Kroese, Taimre, and Botev (2011).

Markov chain samplers

See Markov Chain Monte Carlo.

Multi-level Monte Carlo

Hmmm. Also multiscale Monte Carlo and multi-index Monte Carlo. 🏗️
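Pending a proper write-up: the core trick of multilevel Monte Carlo (Giles 2008) is the telescoping identity \(\mathbb{E}[P_L] = \mathbb{E}[P_0] + \sum_{\ell=1}^{L} \mathbb{E}[P_\ell - P_{\ell-1}]\), where \(P_\ell\) is the quantity of interest computed at discretization level \(\ell\). Because the coupled corrections \(P_\ell - P_{\ell-1}\) have small variance, few samples are needed at the expensive fine levels. A minimal sketch for an Euler-Maruyama discretization of geometric Brownian motion; every numerical parameter here is invented for illustration, and the per-level sample sizes are hard-coded rather than chosen adaptively as in Giles's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def coupled_euler(level, n_paths, T=1.0, x0=1.0, a=0.05, b=0.2):
    """Euler-Maruyama for dX = a X dt + b X dW with 2**level steps (fine grid),
    plus a coarse path with 2**(level-1) steps driven by the *same* Brownian
    increments, summed pairwise, so fine and coarse estimates are coupled."""
    nf = 2 ** level
    dt = T / nf
    dW = rng.normal(scale=np.sqrt(dt), size=(n_paths, nf))
    xf = np.full(n_paths, x0)
    for i in range(nf):
        xf = xf + a * xf * dt + b * xf * dW[:, i]
    if level == 0:
        return xf, np.zeros(n_paths)  # no coarser level exists below level 0
    xc = np.full(n_paths, x0)
    dWc = dW.reshape(n_paths, nf // 2, 2).sum(axis=2)  # pairwise-summed increments
    for i in range(nf // 2):
        xc = xc + a * xc * (2 * dt) + b * xc * dWc[:, i]
    return xf, xc

def mlmc_estimate(n_per_level):
    """Telescoping sum: level-0 mean plus the mean correction at each finer level."""
    return sum(np.mean(xf - xc)
               for level, n in enumerate(n_per_level)
               for xf, xc in [coupled_euler(level, n)])

# Many cheap coarse paths, few expensive fine ones.
print(mlmc_estimate([100_000, 40_000, 16_000, 6_000, 2_500, 1_000]))
# For geometric Brownian motion, E[X_T] = x0 * exp(a * T) ≈ 1.0513.
```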

Population Monte Carlo

Not sure. See Cappé et al. (2004), who summarize it thus:

Importance sampling methods can be iterated like MCMC algorithms, while being more robust against dependence and starting values. The population Monte Carlo principle consists of iterated generations of importance samples, with importance functions depending on the previously generated importance samples. The advantage over MCMC algorithms is that the scheme is unbiased at any iteration and can thus be stopped at any time, while iterations improve the performances of the importance function, thus leading to an adaptive importance sampling.
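In code, that scheme might look like the following Gaussian-proposal sketch. This is my own drastic simplification for illustration; Cappé et al. work with mixture proposals and more careful adaptation. Each iteration draws an importance sample, computes self-normalized weights against the target, and refits the proposal to the weighted sample:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    """Unnormalized log-density of a toy target: a unit Gaussian centred at 3."""
    return -0.5 * (x - 3.0) ** 2

def pmc(n_iters=20, n_particles=2000, mu0=0.0, sigma0=5.0):
    mu, sigma = mu0, sigma0  # initial (deliberately poor) proposal
    for _ in range(n_iters):
        x = rng.normal(mu, sigma, size=n_particles)             # importance sample
        log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)  # proposal log-density
        logw = log_target(x) - log_q                            # log importance weights
        w = np.exp(logw - logw.max())
        w /= w.sum()                                            # self-normalize
        mu = np.sum(w * x)                                      # refit proposal moments
        sigma = np.sqrt(np.sum(w * (x - mu) ** 2)) + 1e-6
    return mu, sigma

print(pmc())  # should approach the target's mean 3 and standard deviation 1
```

Note that, as the abstract says, the weighted sample at every iteration gives an unbiased (self-normalized) estimate of target expectations, so we may stop at any time.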

Sequential Monte Carlo

Filed under particle filters.

Quasi Monte Carlo

Don’t even guess randomly, but sample cleverly from a deterministic low-discrepancy sequence, using shiny Quasi Monte Carlo.
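For a quick taste: scipy ships low-discrepancy generators in scipy.stats.qmc (since scipy 1.7), and on a smooth integrand a scrambled Sobol' sequence typically beats i.i.d. uniform sampling at the same budget. A small comparison, with the integrand invented for illustration:

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(2)
f = lambda u: u.prod(axis=1)  # integral over [0,1]^2 is 1/4
n, truth = 2 ** 12, 0.25      # Sobol' points prefer power-of-two sample sizes

u_mc = rng.uniform(size=(n, 2))                          # plain Monte Carlo points
u_qmc = qmc.Sobol(d=2, scramble=True, seed=2).random(n)  # scrambled Sobol' points

print("MC error: ", abs(f(u_mc).mean() - truth))
print("QMC error:", abs(f(u_qmc).mean() - truth))
```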

Cross Entropy Method

For automatically adapting an importance sampling distribution. TBC.
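Meanwhile, a minimal sketch of the cross-entropy recipe in its optimization guise, following the pattern in Rubinstein and Kroese (2004): sample from a parametric family, keep an elite fraction of the best candidates, refit the family to the elites, repeat. The Gaussian family and all constants are my invention:

```python
import numpy as np

rng = np.random.default_rng(3)

def cem_maximize(f, mu=0.0, sigma=5.0, n=500, elite_frac=0.1, iters=30):
    """Cross-entropy method with a Gaussian sampling family."""
    n_elite = int(n * elite_frac)
    for _ in range(iters):
        x = rng.normal(mu, sigma, size=n)               # sample candidates
        elites = x[np.argsort(f(x))[-n_elite:]]         # keep the best-scoring fraction
        mu, sigma = elites.mean(), elites.std() + 1e-6  # refit the family to the elites
    return mu

# Toy objective with its maximum at x = 2.
print(cem_maximize(lambda x: -(x - 2.0) ** 2))
```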

Monte Carlo gradient estimation

See MC gradients.
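The headline comparison from Mohamed et al. (2020) in one toy example: two unbiased Monte Carlo estimators of \(\nabla_\mu \mathbb{E}_{x\sim\mathcal{N}(\mu,1)}[x^2]\), whose true value is \(2\mu\) since \(\mathbb{E}[x^2]=\mu^2+1\). The script below (my construction) contrasts the score-function (REINFORCE) estimator with the reparameterization estimator; the latter typically has far lower variance:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, n = 1.5, 100_000
f = lambda x: x ** 2  # E[f(x)] = mu**2 + 1, so the true gradient is 2 * mu = 3.0

# Score-function (REINFORCE) estimator: E[f(x) * d/dmu log N(x; mu, 1)].
x = rng.normal(mu, 1.0, size=n)
grad_score = np.mean(f(x) * (x - mu))  # the score of N(mu, 1) in mu is (x - mu)

# Reparameterization estimator: write x = mu + eps with eps ~ N(0, 1),
# then differentiate through f: d f(mu + eps) / d mu = 2 * (mu + eps).
eps = rng.normal(size=n)
grad_reparam = np.mean(2 * (mu + eps))

print(grad_score, grad_reparam)  # both near 3.0; the reparam estimate is much tighter
```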

References

Anderson, David F., and Desmond J. Higham. 2012. “Multilevel Monte Carlo for Continuous Time Markov Chains, with Applications in Biochemical Kinetics.” Multiscale Modeling & Simulation 10 (1): 146–79.
Andrieu, Christophe, and Johannes Thoms. 2008. “A Tutorial on Adaptive MCMC.” Statistics and Computing 18 (4): 343–73.
Atchadé, Yves, Gersende Fort, Eric Moulines, and Pierre Priouret. 2011. “Adaptive Markov Chain Monte Carlo: Theory and Methods.” In Bayesian Time Series Models, edited by David Barber, A. Taylan Cemgil, and Silvia Chiappa, 32–51. Cambridge: Cambridge University Press.
Cappé, O., A. Guillin, J. M. Marin, and C. P. Robert. 2004. “Population Monte Carlo.” Journal of Computational and Graphical Statistics 13 (4): 907–29.
Casella, George, and Christian P. Robert. 1996. “Rao-Blackwellisation of Sampling Schemes.” Biometrika 83 (1): 81–94.
Chopin, Nicolas, and Mathieu Gerber. 2023. “Higher-Order Stochastic Integration Through Cubic Stratification.” arXiv.
Cranmer, Kyle, Johann Brehmer, and Gilles Louppe. 2020. “The Frontier of Simulation-Based Inference.” Proceedings of the National Academy of Sciences 117 (48): 30055–62.
Elvira, Víctor, and Emilie Chouzenoux. 2021. “Optimized Population Monte Carlo.”
Geffner, Tomas, and Justin Domke. 2018. “Using Large Ensembles of Control Variates for Variational Inference.” In Proceedings of the 32nd International Conference on Neural Information Processing Systems, 9982–92. NIPS’18. Red Hook, NY, USA: Curran Associates Inc.
Giles, Michael B. 2008. “Multilevel Monte Carlo Path Simulation.” Operations Research 56 (3): 607–17.
Giles, Michael B., and Lukasz Szpruch. 2014. “Antithetic Multilevel Monte Carlo Estimation for Multi-Dimensional SDEs Without Lévy Area Simulation.” The Annals of Applied Probability 24 (4): 1585–1620.
Giles, Mike, and Lukasz Szpruch. 2012. “Multilevel Monte Carlo Methods for Applications in Finance.” arXiv:1212.1377 [q-fin], December.
Gu, Shixiang, Zoubin Ghahramani, and Richard E. Turner. 2015. “Neural Adaptive Sequential Monte Carlo.” In Advances in Neural Information Processing Systems 28, edited by C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, 2629–37. Curran Associates, Inc.
Haji-Ali, Abdul-Lateef, Fabio Nobile, and Raúl Tempone. 2016. “Multi-Index Monte Carlo: When Sparsity Meets Sampling.” Numerische Mathematik 132 (4): 767–806.
Higham, Desmond J. 2015. “An Introduction to Multilevel Monte Carlo for Option Valuation.” arXiv:1505.00965 [physics, q-fin, stat], May.
Kroese, Dirk P., Thomas Taimre, and Zdravko I. Botev. 2011. Handbook of Monte Carlo Methods. Wiley Series in Probability and Statistics 706. Hoboken, NJ: Wiley.
Liu, Jun S. 1996. “Metropolized Independent Sampling with Comparisons to Rejection Sampling and Importance Sampling.” Statistics and Computing 6 (2): 113–19.
Mohamed, Shakir, Mihaela Rosca, Michael Figurnov, and Andriy Mnih. 2020. “Monte Carlo Gradient Estimation in Machine Learning.” Journal of Machine Learning Research 21 (132): 1–62.
Novak, Erich. 2015. “Some Results on the Complexity of Numerical Integration.” arXiv.
Robert, Christian P., and George Casella. 2004. Monte Carlo Statistical Methods. 2nd ed. Springer Texts in Statistics. New York: Springer.
Rubinstein, Reuven Y., and Dirk P. Kroese. 2016. Simulation and the Monte Carlo Method. 3rd ed. Wiley Series in Probability and Statistics. Hoboken, NJ: Wiley.
Rubinstein, Reuven Y., and Dirk P. Kroese. 2004. The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation and Machine Learning. New York, NY: Springer New York.
Rubinstein, Reuven Y., Ad Ridder, and Radislav Vaisman. 2014. Fast Sequential Monte Carlo Methods for Counting and Optimization. Wiley Series in Probability and Statistics. Hoboken, NJ: Wiley.
Vehtari, Aki, Daniel Simpson, Andrew Gelman, Yuling Yao, and Jonah Gabry. 2019. “Pareto Smoothed Importance Sampling.” arXiv:1507.02646 [stat], July.
Virrion, Benjamin. 2020. “Deep Importance Sampling.” arXiv:2007.02692 [q-fin], July.
Walder, Christian J., Paul Roussel, Richard Nock, Cheng Soon Ong, and Masashi Sugiyama. 2019. “New Tricks for Estimating Gradients of Expectations.” arXiv:1901.11311 [cs, stat], June.
Xia, Yuan. 2011. “Multilevel Monte Carlo Method for Jump-Diffusion SDEs.” arXiv:1106.4730 [q-fin], June.
