Finding functionals (traditionally integrals) approximately by guessing cleverly, i.e. by averaging over random samples. Often, but not always, used for approximate statistical inference; the most prominent use case is Bayesian statistics, where various Monte Carlo methods turn out to be effective for various inference problems, but that is far from the only one.
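The basic trick, sketched in plain Python (the integrand, sampler and sample size below are toy choices of mine, not anything canonical):

```python
import math
import random

def mc_estimate(f, sampler, n=100_000, seed=0):
    """Estimate E[f(X)] by averaging f over n draws from sampler."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += f(sampler(rng))
    return total / n

# Estimate the integral of exp(-x^2) over [0, 1]:
# with U ~ Uniform(0, 1), E[f(U)] equals that integral.
est = mc_estimate(lambda x: math.exp(-x * x), lambda rng: rng.random())
```

The error of such an estimator shrinks like \(O(n^{-1/2})\) regardless of dimension, which is the whole selling point; everything below is about shrinking the constant, or the exponent.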

## Markov chain samplers
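Nothing here yet; as a placeholder, a minimal random-walk Metropolis sketch (the textbook algorithm, applied to a toy target of my choosing):

```python
import math
import random

def metropolis(log_density, x0, steps=50_000, scale=1.0, seed=0):
    """Random-walk Metropolis: a Markov chain whose stationary
    distribution has the given (unnormalised) log density."""
    rng = random.Random(seed)
    x, lp = x0, log_density(x0)
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_density(prop)
        # Accept with probability min(1, p(prop)/p(x)); the
        # normalising constant cancels in the ratio.
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Sample a standard normal from its unnormalised log density.
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(chain) / len(chain)
```

Only the density ratio is needed, which is why this family dominates Bayesian inference, where normalising constants are usually unavailable.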

## Multi-level Monte Carlo

Hmmm.
Also *multi-scale Monte Carlo*, *multi-index Monte Carlo*.
🏗️
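Roughly: write the expectation of a fine-level approximation as a telescoping sum of level corrections, and spend most samples on the cheap coarse levels. A toy sketch, with truncated Taylor series standing in for e.g. SDE discretisation levels, and sample allocations picked by hand:

```python
import math
import random

def f_level(u, l):
    """Level-l approximation of exp(-u**2): truncated Taylor series.
    (A toy stand-in for a level-l PDE/SDE discretisation.)"""
    return sum((-u * u) ** k / math.factorial(k) for k in range(l + 1))

def mlmc(n_per_level, seed=0):
    """Telescoping MLMC estimator of E[f(U)], U ~ Uniform(0, 1):
        E[f_L] = E[f_0] + sum_l E[f_l - f_{l-1}],
    each correction estimated from coupled samples (the same U
    feeds both levels, so the correction has small variance)."""
    rng = random.Random(seed)
    est = 0.0
    for l, n in enumerate(n_per_level):
        s = 0.0
        for _ in range(n):
            u = rng.random()
            s += f_level(u, l) - (f_level(u, l - 1) if l > 0 else 0.0)
        est += s / n
    return est

# Many samples at the cheap coarse level, few at the fine ones.
est = mlmc([100_000, 10_000, 1_000, 100])
```

The coupling is the point: because adjacent levels agree closely, the corrections need far fewer samples than a single fine-level estimate would.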

## Population Monte Carlo

Not sure. See Cappé et al. (2004):

Importance sampling methods can be iterated like MCMC algorithms, while being more robust against dependence and starting values. The population Monte Carlo principle consists of iterated generations of importance samples, with importance functions depending on the previously generated importance samples. The advantage over MCMC algorithms is that the scheme is unbiased at every iteration and can thus be stopped at any time, while successive iterations improve the importance function, yielding an adaptive importance sampler.
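A minimal sketch of that loop, with a Gaussian proposal refit to the weighted sample each generation (the target, sample sizes and deliberately bad starting proposal are all my invention):

```python
import math
import random

def pmc(log_target, n=2000, iters=10, seed=0):
    """Population Monte Carlo sketch: each generation draws an
    importance sample from the current Gaussian proposal, weights
    it against the target, then refits the proposal."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 5.0  # deliberately poor starting proposal
    for _ in range(iters):
        xs = [rng.gauss(mu, sigma) for _ in range(n)]
        # Unnormalised log importance weights: target / proposal
        # (constants drop out under self-normalisation).
        logw = [log_target(x)
                + 0.5 * ((x - mu) / sigma) ** 2 + math.log(sigma)
                for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        tot = sum(w)
        # Self-normalised moment estimates become the next
        # proposal's parameters: the adaptation step.
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / tot
        var = sum(wi * (xi - mu) ** 2 for wi, xi in zip(w, xs)) / tot
        sigma = max(math.sqrt(var), 1e-3)
    return mu, sigma

# Target: N(3, 1), known only up to a constant.
mu, sigma = pmc(lambda x: -0.5 * (x - 3.0) ** 2)
```

Each generation is a valid importance sample on its own, which is what licenses the stop-any-time claim above.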

## Sequential Monte Carlo

Filed under particle filters.

## Quasi Monte Carlo

Don’t even guess randomly, but sample cleverly using shiny low-discrepancy sequences, a.k.a. quasi-Monte Carlo.
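For instance, a pure-Python Halton sequence (a standard low-discrepancy construction), used here to estimate π by integrating an indicator over the unit square:

```python
def van_der_corput(i, base):
    """i-th term of the van der Corput sequence in the given base:
    reflect the base-b digits of i about the radix point."""
    x, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, digit = divmod(i, base)
        x += digit / denom
    return x

def halton(n, bases=(2, 3)):
    """First n points of the Halton sequence, one coprime base per
    dimension: deterministic but evenly spread over [0, 1)^d."""
    return [tuple(van_der_corput(i, b) for b in bases)
            for i in range(1, n + 1)]

# QMC estimate of the quarter-disc area, i.e. pi/4:
pts = halton(10_000)
est = 4 * sum(x * x + y * y <= 1.0 for x, y in pts) / len(pts)
```

For smooth integrands the error can decay nearly like \(O(n^{-1})\) rather than \(O(n^{-1/2})\); for indicator functions like this one the gain is smaller but still real.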

## Cross Entropy Method

For automatically adapting an importance sampling distribution. TBC.
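A minimal sketch of the cross-entropy update, here used for maximisation; the elite-fraction refit is the same move that, in the rare-event setting, tilts an importance sampling density toward the event of interest. All parameters below are my choices:

```python
import random

def cross_entropy_max(score, n=200, elite_frac=0.1, iters=30, seed=0):
    """Cross-entropy method: repeatedly sample from a Gaussian,
    keep the elite fraction by score, and refit the Gaussian to
    the elites, concentrating the sampler on high-score regions."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 10.0
    k = max(1, int(n * elite_frac))
    for _ in range(iters):
        xs = sorted((rng.gauss(mu, sigma) for _ in range(n)),
                    key=score, reverse=True)
        elite = xs[:k]
        mu = sum(elite) / k
        sigma = (sum((x - mu) ** 2 for x in elite) / k) ** 0.5 + 1e-6
    return mu

# Maximise a smooth unimodal score with optimum at x = 2.
best = cross_entropy_max(lambda x: -(x - 2.0) ** 2)
```

The refit minimises the cross-entropy (KL divergence) between the parametric family and the elite distribution, hence the name.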

## References

*Multiscale Modeling & Simulation* 10 (1): 146–79.

*Statistics and Computing* 18 (4): 343–73.

*Bayesian Time Series Models*, edited by David Barber, A. Taylan Cemgil, and Silvia Chiappa, 32–51. Cambridge: Cambridge University Press.

*Journal of Computational and Graphical Statistics* 13 (4): 907–29.

*Biometrika* 83 (1): 81–94.

*Proceedings of the National Academy of Sciences*, May.

*Proceedings of the 32nd International Conference on Neural Information Processing Systems*, 9982–92. NIPS’18. Red Hook, NY, USA: Curran Associates Inc.

*Operations Research* 56 (3): 607–17.

*The Annals of Applied Probability* 24 (4): 1585–1620.

*arXiv:1212.1377 [q-Fin]*, December.

*Advances in Neural Information Processing Systems 28*, edited by C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, 2629–37. Curran Associates, Inc.

*Numerische Mathematik* 132 (4): 767–806.

*arXiv:1505.00965 [Physics, q-Fin, Stat]*, May.

*Statistics and Computing* 6 (2): 113–19.

*Journal of Machine Learning Research* 21 (132): 1–62.

*Monte Carlo Statistical Methods*. 2nd ed. Springer Texts in Statistics. New York: Springer.

*Simulation and the Monte Carlo Method*. 3rd ed. Wiley Series in Probability and Statistics. Hoboken, New Jersey: Wiley.

*The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation and Machine Learning*. New York, NY: Springer New York.

*Fast Sequential Monte Carlo Methods for Counting and Optimization*. Wiley Series in Probability and Statistics. Hoboken, New Jersey: Wiley.

*arXiv:1507.02646 [Stat]*, July.

*arXiv:2007.02692 [q-Fin]*, July.

*arXiv:1901.11311 [Cs, Stat]*, June.

*arXiv:1106.4730 [q-Fin]*, June.
