Approximate Bayesian Computation is a terribly underspecified term. There are many ways that inference can be based upon simulations, many kinds of freedom from the likelihood, and many ways to approximate Bayesian computation. This page is about the dominant use of the term: simulation-based inference for Bayes updates where the likelihood is not available in closed form, but where we can simulate from the generative model.
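The recipe in its crudest form is rejection ABC: draw parameters from the prior, simulate data, and keep the parameters whose simulations match the observation. A minimal sketch on a toy Poisson model (where exact matching of the discrete observation makes the accepted draws an exact posterior sample):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: infer a Poisson rate from a single observed count,
# pretending the likelihood were unavailable and we could only simulate.
x_obs = 7                                          # observed count
lam = rng.exponential(scale=5.0, size=200_000)     # prior draws, lam ~ Exp(mean 5)
x_sim = rng.poisson(lam)                           # one simulation per prior draw
posterior = lam[x_sim == x_obs]                    # keep exact matches
```

Because the data are discrete and we demand an exact match, `posterior` is distributed as the true posterior, here Gamma(8, rate 1.2), with mean 8/1.2 ≈ 6.67. For continuous data one instead accepts simulations within a tolerance of the observation, which is where the approximation (and the tuning headaches) begin.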

Obviously there are other ways you can approximate Bayesian computation; see e.g. variational Bayes.

TBD: the relationship between this and simulation-based inference in a frequentist setting, often called indirect inference. The two literatures look similar but tend not to cite each other. Is the divide technical or sociological?

Miles Cranmer's Introduction to Simulation-based inference.

## SMC for ABC

One can target the ABC posterior using Sequential Monte Carlo, propagating a population of particles through a sequence of shrinking tolerances rather than rejecting against the final tolerance directly. TBD.
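A sketch of the idea in plain NumPy, in the style of the Toni et al. ABC-SMC scheme: rejection ABC at a loose tolerance seeds the particles, then each stage resamples, perturbs with a Gaussian kernel, accepts against a tighter tolerance, and importance-reweights by prior over kernel density. The toy model and all tuning constants here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: infer the mean of a N(theta, 1) from the sample mean of
# n_obs = 50 draws. Prior: theta ~ N(0, 5^2).
n_obs = 50
theta_true = 1.5
x_obs = rng.normal(theta_true, 1.0, size=n_obs).mean()  # observed summary

def simulate(theta):
    """Simulate the summary statistic at a given parameter."""
    return rng.normal(theta, 1.0, size=n_obs).mean()

def abc_smc(epsilons, n_particles=1000):
    """ABC-SMC with a Gaussian perturbation kernel over shrinking tolerances."""
    # Stage 0: plain rejection ABC at the loosest tolerance.
    theta = rng.normal(0.0, 5.0, size=n_particles * 20)
    dist = np.abs(np.array([simulate(t) for t in theta]) - x_obs)
    theta = theta[dist < epsilons[0]][:n_particles]
    w = np.full(len(theta), 1.0 / len(theta))
    for eps in epsilons[1:]:
        tau = 2.0 * np.var(theta)  # kernel scale from the current particles
        new_theta, new_w = [], []
        while len(new_theta) < n_particles:
            # Resample a particle, perturb it, accept if its simulation is close.
            t = rng.choice(theta, p=w) + rng.normal(0.0, np.sqrt(tau))
            if np.abs(simulate(t) - x_obs) < eps:
                # Importance weight: prior density over kernel mixture density
                # (shared normalizing constants cancel after renormalization).
                prior = np.exp(-t**2 / (2 * 5.0**2))
                kern = np.sum(w * np.exp(-(t - theta) ** 2 / (2 * tau)))
                new_theta.append(t)
                new_w.append(prior / kern)
        theta, w = np.array(new_theta), np.array(new_w)
        w /= w.sum()
    return theta, w

theta, w = abc_smc(epsilons=[1.0, 0.4, 0.2])
post_mean = float(np.sum(w * theta))
```

The payoff over one-shot rejection ABC is that the intermediate stages keep the acceptance rate workable even as the final tolerance gets small.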

## Bayesian Synthetic Likelihood

TBD. The idea: assume the summary statistic is approximately jointly Gaussian under the model, estimate its mean and covariance at each candidate parameter by simulation, and use the resulting Gaussian density as a "synthetic" likelihood inside a standard MCMC sampler.
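A minimal sketch of that loop, with a random-walk Metropolis sampler around the simulated Gaussian likelihood estimate. The toy model, summary choice, and all constants are illustrative, not a faithful reproduction of any particular BSL paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: x ~ N(theta, 1); summaries = (sample mean, sample sd).
n_obs = 100
theta_true = 2.0

def summaries(x):
    return np.array([x.mean(), x.std()])

s_obs = summaries(rng.normal(theta_true, 1.0, size=n_obs))

def synthetic_loglik(theta, m=200):
    """Gaussian synthetic log-likelihood estimated from m simulated summaries."""
    sims = np.array(
        [summaries(rng.normal(theta, 1.0, size=n_obs)) for _ in range(m)]
    )
    mu = sims.mean(axis=0)
    cov = np.cov(sims, rowvar=False) + 1e-9 * np.eye(2)  # light regularization
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)

def log_prior(theta):
    return -theta**2 / (2 * 10.0**2)  # N(0, 10^2), up to a constant

def bsl_mcmc(n_iter=600, step=0.3):
    """Random-walk Metropolis targeting the synthetic posterior."""
    theta = 0.0
    lp = log_prior(theta) + synthetic_loglik(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_prior(prop) + synthetic_loglik(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain)

chain = bsl_mcmc()
post_mean = float(chain[200:].mean())  # discard burn-in
```

Note the estimated log-likelihood is noisy, so this is a pseudo-marginal-style sampler; the simulation budget `m` trades compute against chain stickiness.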

## Neural methods

See the Mackelab sbi page for several:

Goal: Algorithmically identify mechanistic models which are consistent with data. Each of the methods above needs three inputs: a candidate mechanistic model, prior knowledge or constraints on model parameters, and observational data (or summary statistics thereof).

The methods then proceed by

- sampling parameters from the prior, then simulating synthetic data from these parameters;
- learning the (probabilistic) association between data (or data features) and the underlying parameters, i.e. learning statistical inference from the simulated data. How this association is learned differs between the methods, but all use deep neural networks;
- applying the learned network to empirical data to derive the full space of parameters consistent with the data and the prior, i.e. the posterior distribution. High posterior probability is assigned to parameters consistent with both the data and the prior, low probability to inconsistent ones. While SNPE directly learns the posterior distribution, SNLE and SNRE need an extra MCMC sampling step to construct a posterior;
- if needed, using an initial estimate of the posterior to adaptively generate additional informative simulations.

Code here: mackelab/sbi: Simulation-based inference in PyTorch
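The simulate-then-learn loop above can be sketched in plain NumPy, with a linear-Gaussian regression standing in for the conditional density network that sbi would train (a deliberate simplification; it is essentially a global version of classic regression-adjustment ABC, and all names and constants here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1) Prior and simulator (toy: x is the mean of 20 draws from N(theta, 1)).
def prior(n):
    return rng.normal(0.0, 3.0, size=n)

def simulator(theta):
    return rng.normal(theta[:, None], 1.0, size=(theta.size, 20)).mean(axis=1)

# 2) Simulate a training set from the joint p(theta, x).
theta_train = prior(20_000)
x_train = simulator(theta_train)

# 3) "Learn" the posterior: a linear-Gaussian fit of theta given x,
#    standing in for the deep network of SNPE-style methods.
A = np.vstack([x_train, np.ones_like(x_train)]).T
coef, *_ = np.linalg.lstsq(A, theta_train, rcond=None)
resid_sd = float(np.std(theta_train - A @ coef))

# 4) Amortized inference: apply the fit to any observed data point.
def approx_posterior(x_obs):
    """Approximate Gaussian posterior mean and sd for theta given x_obs."""
    return coef[0] * x_obs + coef[1], resid_sd

theta_true = 1.0
x_obs = simulator(np.array([theta_true]))[0]
post_mean, post_sd = approx_posterior(x_obs)
```

Because the toy model really is linear-Gaussian, the fit recovers the exact posterior family here; the point of the neural methods is to learn the same conditional mapping when it is nonlinear and non-Gaussian, and step 4 illustrates why the result is "amortized": one trained fit serves every new observation.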

## Generalized Bayesian computation

An extension of Bayesian computation that avoids the KL divergence implicit in the standard Bayes update, substituting other discrepancies. See generalized Bayes.

