Approximate Bayesian Computation is a terribly underspecified description. There are many ways that inference can be based upon simulations, many kinds of freedom from the likelihood, and many ways to approximate Bayesian computation. This page is about the dominant use of the term: using simulation-based inference to perform Bayes updates when the likelihood is unavailable but we can still simulate from the generative model.

Obviously there are other ways you can approximate Bayesian computation — see e.g. variational Bayes.

TBD: the relationship between this and simulation-based inference in a frequentist setting, often called indirect inference. The two literatures look similar but tend not to cite each other. Is the barrier technical or sociological?

See Miles Cranmer’s Introduction to Simulation-based inference.

## SMC for ABC

One can implement ABC via Sequential Monte Carlo: maintain a population of parameter particles, progressively tighten the acceptance tolerance, and reweight and perturb the particles at each stage. TBD.
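A minimal sketch of ABC-SMC in the style of Toni et al., on an invented toy problem (all names and the model here are illustrative, not from any particular library): the observed data are draws from a Gaussian with unknown mean, the summary statistic is the sample mean, and the tolerance schedule is chosen by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: observations from N(theta_true, 1); summary = sample mean.
theta_true = 2.0
y_obs = rng.normal(theta_true, 1.0, size=100)
s_obs = y_obs.mean()

def simulate_summary(theta, rng):
    return rng.normal(theta, 1.0, size=100).mean()

def prior_sample(rng):
    return rng.uniform(-5, 5)

def prior_pdf(theta):
    return 0.1 if -5 <= theta <= 5 else 0.0

n_particles = 400
eps_schedule = [1.0, 0.5, 0.2, 0.1]  # hand-picked, decreasing tolerances

# Population 0: plain ABC rejection at the loosest tolerance.
particles, weights = [], []
while len(particles) < n_particles:
    th = prior_sample(rng)
    if abs(simulate_summary(th, rng) - s_obs) < eps_schedule[0]:
        particles.append(th)
        weights.append(1.0)
particles = np.array(particles)
weights = np.array(weights) / n_particles

for eps in eps_schedule[1:]:
    # Gaussian perturbation kernel scaled to the current population spread.
    tau = 2.0 * np.sqrt(np.cov(particles, aweights=weights))
    new_particles, new_weights = [], []
    while len(new_particles) < n_particles:
        # Resample a particle by weight, then perturb it.
        th = rng.choice(particles, p=weights)
        th_new = th + tau * rng.normal()
        if prior_pdf(th_new) == 0.0:
            continue
        if abs(simulate_summary(th_new, rng) - s_obs) < eps:
            # Importance weight corrects for the resample-and-perturb proposal.
            kernel = np.exp(-0.5 * ((th_new - particles) / tau) ** 2)
            new_particles.append(th_new)
            new_weights.append(prior_pdf(th_new) / np.sum(weights * kernel))
    particles = np.array(new_particles)
    weights = np.array(new_weights)
    weights /= weights.sum()

post_mean = np.sum(weights * particles)  # weighted ABC posterior mean
```

The point of the decreasing schedule is that each population proposes from a distribution already close to the target, so the final tight tolerance is reachable without the catastrophic rejection rate plain rejection-ABC would suffer.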

## Bayesian Synthetic Likelihood

TBD. The idea: assume the summary statistic is approximately jointly Gaussian, estimate its mean and covariance by simulating repeatedly at each candidate parameter, and use the resulting Gaussian density evaluated at the observed summary as a substitute likelihood.
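A toy sketch of the synthetic-likelihood idea (model, summaries, and grid are all invented for illustration): at each candidate parameter we simulate a batch of datasets, fit a Gaussian to their summary statistics, and score the observed summary under that Gaussian.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: data ~ N(theta, 1); summaries = (mean, variance) of the sample.
theta_true = 1.5
y_obs = rng.normal(theta_true, 1.0, size=200)
s_obs = np.array([y_obs.mean(), y_obs.var()])

def summaries(theta, m, rng):
    """Simulate m datasets at theta; return an (m, 2) array of summaries."""
    y = rng.normal(theta, 1.0, size=(m, 200))
    return np.stack([y.mean(axis=1), y.var(axis=1)], axis=1)

def synthetic_loglik(theta, rng, m=200):
    # Fit a Gaussian to the simulated summaries ...
    s = summaries(theta, m, rng)
    mu = s.mean(axis=0)
    cov = np.cov(s, rowvar=False)
    # ... and evaluate the observed summary under it (up to a constant).
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)

# With a flat prior on a grid, the (unnormalised) log-posterior is just
# the synthetic log-likelihood.
grid = np.linspace(0.0, 3.0, 61)
loglik = np.array([synthetic_loglik(t, rng) for t in grid])
theta_hat = grid[np.argmax(loglik)]
```

In practice this surrogate likelihood is plugged into MCMC rather than a grid, but the grid makes the Gaussian-surrogate step itself easy to see.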

## Neural methods

See the Mackelab sbi page, which describes several:

Goal: algorithmically identify mechanistic models which are consistent with data. Each of the methods needs three inputs: a candidate mechanistic model, prior knowledge or constraints on model parameters, and observational data (or summary statistics thereof).

The methods then proceed by:

- sampling parameters from the prior and simulating synthetic data from those parameters;
- learning the (probabilistic) association between data (or data features) and underlying parameters, i.e. learning statistical inference from simulated data. How this association is learned differs between the methods, but all use deep neural networks;
- applying the learned neural network to empirical data to derive the full space of parameters consistent with the data and the prior, i.e. the posterior distribution. High posterior probability is assigned to parameters consistent with both the data and the prior, low probability to inconsistent parameters. While SNPE learns the posterior distribution directly, SNLE and SNRE need an extra MCMC sampling step to construct a posterior;
- if needed, using an initial estimate of the posterior to adaptively generate additional informative simulations.

Code here: mackelab/sbi: Simulation-based inference in PyTorch
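The three-step loop above can be sketched without any neural-network machinery on a toy linear-Gaussian model (everything here is invented for illustration): ordinary least squares stands in for the deep network, which is exact in this case because the true posterior mean is linear in the observation. The sbi methods replace the regression with conditional density estimators, but the amortisation structure is the same.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1. Sample parameters from the prior and simulate synthetic data.
#    Toy simulator: x = theta + noise, theta ~ N(0, 1), noise ~ N(0, 0.5^2).
n_train = 5000
theta = rng.normal(0.0, 1.0, size=n_train)
x = theta + rng.normal(0.0, 0.5, size=n_train)

# 2. Learn the association from data back to parameters. The sbi methods
#    use deep neural networks here; linear least squares stands in, and is
#    exact for this conjugate toy (posterior mean is linear in x).
A = np.stack([x, np.ones_like(x)], axis=1)
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)

# 3. Apply the learned map to empirical data: an amortised posterior-mean
#    estimate for any observed x, with no further simulation.
x_obs = 1.2
post_mean_est = coef[0] * x_obs + coef[1]

# Analytic check: for this conjugate model the posterior mean is
# x * (1/0.25) / (1 + 1/0.25) = 0.8 * x.
post_mean_true = 0.8 * x_obs
```

The payoff of amortisation is visible in step 3: once the map is trained, new observations cost one function evaluation each, not a fresh simulation campaign.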

