Approximate Bayesian Computation

Posterior updates without likelihood



Approximate Bayesian Computation is a terribly underspecified description. There are many ways that inference can be based upon simulations, many senses of freedom from the likelihood, and many ways to approximate Bayesian computation. This page is about the dominant use of that term: the use of Simulation-based inference to do Bayes updates where the likelihood is not available, but where we can still simulate from the generative model.

Obviously there are other ways you can approximate Bayesian computation; see e.g. variational Bayes.
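To fix ideas, here is a minimal sketch of the classic rejection-ABC recipe: draw parameters from the prior, simulate, and keep only the draws whose simulated summaries land within a tolerance epsilon of the observed summaries. The toy Gaussian-location model, the summary statistic and the tolerance below are placeholder choices of mine, not anything canonical.

```python
import numpy as np

def rejection_abc(simulate, summary, distance, prior_sample, s_obs,
                  epsilon, n_draws=10_000, rng=None):
    """Keep prior draws whose simulated summaries fall within epsilon of s_obs."""
    rng = np.random.default_rng(rng)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)                # theta ~ prior
        s_sim = summary(simulate(theta, rng))    # simulate, then summarise
        if distance(s_sim, s_obs) <= epsilon:    # accept if "close enough"
            accepted.append(theta)
    return np.array(accepted)                    # approximate posterior sample

# Toy example: infer the mean of a Gaussian with known scale.
y_obs = np.random.default_rng(0).normal(1.0, 1.0, size=50)
posterior_draws = rejection_abc(
    simulate=lambda th, rng: rng.normal(th, 1.0, size=50),
    summary=np.mean,
    distance=lambda a, b: abs(a - b),
    prior_sample=lambda rng: rng.normal(0.0, 5.0),
    s_obs=np.mean(y_obs),
    epsilon=0.1,
)
```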

TBD: the relationship between this and simulation-based inference in a frequentist setting, often called indirect inference. The two literatures look similar but tend not to cite each other. Is the divide technical or sociological?

Miles Cranmer's Introduction to Simulation-based inference.

SMC for ABC

One can implement ABC inside a Sequential Monte Carlo sampler, propagating a population of parameter particles through a sequence of shrinking tolerances (Sisson, Fan, and Tanaka 2007). A proper treatment is TBD.
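Here is a compact sketch of the sort of scheme I mean (roughly the flavour of Sisson, Fan, and Tanaka 2007), assuming user-supplied simulate, distance, prior_sample and prior_logpdf callables and a hand-picked decreasing tolerance schedule epsilons. A serious implementation would adapt the tolerances and the perturbation kernel rather than hard-coding them.

```python
import numpy as np

def abc_smc(simulate, distance, prior_sample, prior_logpdf, s_obs,
            epsilons, n_particles=500, kernel_std=0.5, rng=None):
    """Toy ABC-SMC: anneal a particle population through a decreasing
    tolerance schedule, reweighting and perturbing at each stage."""
    rng = np.random.default_rng(rng)

    # Stage 0: plain rejection ABC at the loosest tolerance.
    particles = []
    while len(particles) < n_particles:
        theta = np.atleast_1d(prior_sample(rng))
        if distance(simulate(theta, rng), s_obs) <= epsilons[0]:
            particles.append(theta)
    particles = np.array(particles, dtype=float)
    weights = np.full(n_particles, 1.0 / n_particles)

    for eps in epsilons[1:]:
        new_particles, new_weights = [], []
        while len(new_particles) < n_particles:
            # Resample a particle from the previous population ...
            theta = particles[rng.choice(n_particles, p=weights)]
            # ... perturb it with a Gaussian kernel ...
            theta = theta + kernel_std * rng.standard_normal(theta.shape)
            log_prior = prior_logpdf(theta)
            if not np.isfinite(log_prior):
                continue  # fell outside the prior support
            # ... and keep it only if its simulation matches within the new tolerance.
            if distance(simulate(theta, rng), s_obs) <= eps:
                # Importance weight: prior density over the kernel mixture density
                # (the kernel's normalising constant cancels after normalisation).
                kernel = np.exp(-0.5 * np.sum((particles - theta) ** 2, axis=1)
                                / kernel_std ** 2)
                new_particles.append(theta)
                new_weights.append(np.exp(log_prior) / np.sum(weights * kernel))
        particles = np.array(new_particles)
        weights = np.array(new_weights)
        weights /= weights.sum()
    return particles, weights
```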

Bayesian Synthetic Likelihood

In Bayesian synthetic likelihood we assume the summary statistic is approximately jointly Gaussian given the parameters, estimate its mean and covariance from repeated simulations at each candidate parameter value, and plug the resulting Gaussian density in as a likelihood inside an otherwise standard Bayesian sampler (Ong et al. 2018a; Frazier et al. 2021). A fuller treatment is TBD.
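A minimal sketch of the Gaussian plug-in, where simulate_summaries, the number of simulations, and the covariance jitter are my own placeholder choices; in practice one embeds this inside an MCMC or variational scheme as in the papers cited above.

```python
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, simulate_summaries, s_obs, n_sims=200, rng=None):
    """Gaussian synthetic log-likelihood: fit a Gaussian to summary statistics
    simulated at theta, then evaluate the observed summary under it."""
    rng = np.random.default_rng(rng)
    S = np.stack([simulate_summaries(theta, rng) for _ in range(n_sims)])
    mu = S.mean(axis=0)
    cov = np.cov(S, rowvar=False) + 1e-8 * np.eye(S.shape[1])  # jitter for stability
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=cov)
```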

Neural methods

See the Mackelab sbi page for several such methods:

Goal: Algorithmically identify mechanistic models which are consistent with data.

Each of these methods (SNPE, SNLE, SNRE and friends) needs three inputs: a candidate mechanistic model, prior knowledge or constraints on model parameters, and observational data (or summary statistics thereof).

The methods then proceed by

  1. sampling parameters from the prior and simulating synthetic data from those parameters;
  2. learning the (probabilistic) association between data (or data features) and the underlying parameters, i.e. learning statistical inference from simulated data. How this association is learned differs between the methods, but all of them use deep neural networks;
  3. applying the learned network to empirical data to derive the full space of parameters consistent with the data and the prior, i.e. the posterior distribution. High posterior probability is assigned to parameters consistent with both the data and the prior, low probability to inconsistent ones. While SNPE learns the posterior distribution directly, SNLE and SNRE need an extra MCMC sampling step to construct a posterior;
  4. if needed, using an initial estimate of the posterior to adaptively generate additional, more informative simulations.

Code here: mackelab/sbi: Simulation-based inference in PyTorch
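For concreteness, the happy path through that package looks roughly like this (paraphrasing the sbi getting-started tutorial with a toy simulator; the API has shifted between releases, so treat this as a sketch rather than gospel):

```python
import torch
from sbi.inference import SNPE, prepare_for_sbi, simulate_for_sbi
from sbi.utils import BoxUniform

# A toy simulator standing in for the mechanistic model:
# 3 parameters observed through additive Gaussian noise.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

prior = BoxUniform(low=-2.0 * torch.ones(3), high=2.0 * torch.ones(3))
simulator, prior = prepare_for_sbi(simulator, prior)

# Steps 1-2: sample from the prior, simulate, and train a neural
# posterior estimator on the resulting (theta, x) pairs.
inference = SNPE(prior=prior)
theta, x = simulate_for_sbi(simulator, proposal=prior, num_simulations=2000)
density_estimator = inference.append_simulations(theta, x).train()

# Step 3: condition the learned posterior on the observed data.
posterior = inference.build_posterior(density_estimator)
x_o = torch.zeros(3)  # placeholder for the empirical observation
samples = posterior.sample((1000,), x=x_o)
```

Swapping SNPE for SNLE or SNRE changes what the network learns (a surrogate likelihood or likelihood ratio rather than the posterior) and brings in the extra sampling step mentioned in point 3 above.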

SBC

Simulation-based calibration (Talts et al. 2020). Not the same thing, despite the confusable name: SBC repeatedly simulates parameters from the prior and data from the model to check whether a given Bayesian inference pipeline is well calibrated, i.e. whether the rank of the true parameter among posterior draws is uniformly distributed. It is a diagnostic for the computation rather than an inference method in itself.

Martin Modrák, SBC Tutorial
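The mechanics are easy to sketch for a scalar parameter; prior_sample, simulate and posterior_sample below are placeholders for whatever model and inference procedure is being checked.

```python
import numpy as np

def sbc_ranks(prior_sample, simulate, posterior_sample,
              n_replications=200, n_posterior_draws=100, rng=None):
    """Simulation-based calibration for a scalar parameter: if the inference
    procedure is calibrated, the rank of each ground-truth draw among its
    posterior draws is uniform on {0, ..., n_posterior_draws}."""
    rng = np.random.default_rng(rng)
    ranks = []
    for _ in range(n_replications):
        theta_star = prior_sample(rng)                       # "true" parameter
        y = simulate(theta_star, rng)                        # synthetic data set
        draws = posterior_sample(y, n_posterior_draws, rng)  # procedure under test
        ranks.append(int(np.sum(np.asarray(draws) < theta_star)))
    return np.array(ranks)  # compare against a discrete uniform, e.g. via a histogram
```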

Generalized Bayesian computation

A generalization of ABC that drops the implicit KL-divergence/log-likelihood in favour of more general loss functions, i.e. generalized Bayesian updating in a simulation-based setting (Schmon, Cannon, and Knoblauch 2021). See generalized Bayes.

References

Alquier, Pierre. 2020. “Approximate Bayesian Inference.” Entropy 22 (11): 1272.
Baydin, Atılım Güneş, Lei Shao, Wahid Bhimji, Lukas Heinrich, Lawrence Meadows, Jialin Liu, Andreas Munk, et al. 2019. “Etalumis: Bringing Probabilistic Programming to Scientific Simulators at Scale.” In arXiv:1907.03382 [Cs, Stat].
Beaumont, Mark A, Wenyang Zhang, and David J Balding. 2002. “Approximate Bayesian Computation in Population Genetics.” Genetics 162 (4): 2025–35.
Blum, Michael G. B., and Olivier François. 2010. “Non-Linear Regression Models for Approximate Bayesian Computation.” Statistics and Computing 20 (1): 63–73.
Corenflos, Adrien, James Thornton, George Deligiannidis, and Arnaud Doucet. 2021. “Differentiable Particle Filtering via Entropy-Regularized Optimal Transport.” arXiv:2102.07850 [Cs, Stat], June.
Cranmer, Kyle, Johann Brehmer, and Gilles Louppe. 2020. “The Frontier of Simulation-Based Inference.” Proceedings of the National Academy of Sciences, May.
Cranmer, Kyle, Juan Pavez, and Gilles Louppe. 2015. “Approximating Likelihood Ratios with Calibrated Discriminative Classifiers,” June.
Diggle, Peter J., and Richard J. Gratton. 1984. “Monte Carlo Methods of Inference for Implicit Statistical Models.” Journal of the Royal Statistical Society: Series B (Methodological) 46 (2): 193–212.
Drovandi, Christopher C., Clara Grazian, Kerrie Mengersen, and Christian Robert. 2018. “Approximating the Likelihood in Approximate Bayesian Computation.” arXiv:1803.06645 [Stat], March.
Drovandi, Christopher, and David T. Frazier. 2021. “A Comparison of Likelihood-Free Methods With and Without Summary Statistics.” arXiv:2103.02407 [Stat], March.
Durkan, Conor, George Papamakarios, and Iain Murray. 2018. “Sequential Neural Methods for Likelihood-Free Inference,” 9.
Fan, Yanan, David J. Nott, and Scott A. Sisson. 2013. “Approximate Bayesian Computation via Regression Density Estimation.” Stat 2 (1): 34–48.
Forneron, Jean-Jacques, and Serena Ng. 2015. “The ABC of Simulation Estimation with Auxiliary Statistics.” arXiv:1501.01265 [Stat], January.
Frazier, David T., and Christopher Drovandi. 2021. “Robust Approximate Bayesian Inference With Synthetic Likelihood.” Journal of Computational and Graphical Statistics 0 (0): 1–19.
Frazier, David T., David J. Nott, Christopher Drovandi, and Robert Kohn. 2021. “Bayesian Inference Using Synthetic Likelihood: Asymptotics and Adjustments.” arXiv:1902.04827 [Stat], March.
Gelman, Andrew, Aki Vehtari, Daniel Simpson, Charles C. Margossian, Bob Carpenter, Yuling Yao, Lauren Kennedy, Jonah Gabry, Paul-Christian Bürkner, and Martin Modrák. 2020. “Bayesian Workflow.” arXiv:2011.01808 [Stat], November.
Gourieroux, Christian, and Alain Monfort. 1993. “Simulation-Based Inference: A Survey with Special Reference to Panel Data Models.” Journal of Econometrics 59 (1–2): 5–33.
Hermans, Joeri, Arnaud Delaunoy, François Rozet, Antoine Wehenkel, Volodimir Begy, and Gilles Louppe. 2023. “A Crisis In Simulation-Based Inference? Beware, Your Posterior Approximations Can Be Unfaithful.” Transactions on Machine Learning Research, January.
Izbicki, Rafael, Ann B. Lee, and Taylor Pospisil. 2019. “ABC–CDE: Toward Approximate Bayesian Computation With Complex High-Dimensional Data and Limited Simulations.” Journal of Computational and Graphical Statistics 28 (3): 481–92.
Le, Tuan Anh, Atılım Güneş Baydin, and Frank Wood. 2017. “Inference Compilation and Universal Probabilistic Programming.” In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), 54:1338–48. Proceedings of Machine Learning Research. Fort Lauderdale, FL, USA: PMLR.
Lei, Jing, and Peter Bickel. 2009. “Ensemble Filtering for High Dimensional Nonlinear State Space Models.” University of California, Berkeley, Rep 779: 23.
Lueckmann, Jan-Matthis, Giacomo Bassetto, Theofanis Karaletsos, and Jakob H. Macke. 2019. “Likelihood-Free Inference with Emulator Networks.” In Symposium on Advances in Approximate Bayesian Inference, 32–53.
Meeds, Edward, and Max Welling. 2014. “GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation.” In Proceedings of the Thirtieth Conference on Uncertainty in Artificial Intelligence, 593–602. UAI ’14. Arlington, Virginia, USA: AUAI Press.
Mohamed, Shakir, and Balaji Lakshminarayanan. 2016. “Learning in Implicit Generative Models,” November.
Neal, Radford. 2008. “Computing Likelihood Functions for High-Energy Physics Experiments When Distributions Are Defined by Simulators with Nuisance Parameters.”
Nott, David J., Lucy Marshall, and Tran Minh Ngoc. 2012. “The Ensemble Kalman Filter Is an ABC Algorithm.” Statistics and Computing 22 (6): 1273–76.
Ong, Victor M.-H., David J. Nott, Minh-Ngoc Tran, Scott A. Sisson, and Christopher C. Drovandi. 2018a. “Likelihood-Free Inference in High Dimensions with Synthetic Likelihood.” Computational Statistics & Data Analysis 128 (December): 271–91.
Ong, Victor M.-H., David J. Nott, Minh-Ngoc Tran, Scott A. Sisson, and Christopher C. Drovandi. 2018b. “Variational Bayes with Synthetic Likelihood.” Statistics and Computing 28 (4): 971–88.
Papamakarios, George, and Iain Murray. 2016. “Fast ε-Free Inference of Simulation Models with Bayesian Conditional Density Estimation.” In Advances in Neural Information Processing Systems 29, edited by D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, and R. Garnett, 1028–36. Curran Associates, Inc.
Park, Mijung, Wittawat Jitkrittum, and Dino Sejdinovic. 2015. “K2-ABC: Approximate Bayesian Computation with Kernel Embeddings.” arXiv.
Rubin, Donald B. 1984. “Bayesianly Justifiable and Relevant Frequency Calculations for the Applied Statistician.” Annals of Statistics 12 (4): 1151–72.
Säilynoja, Teemu, Paul-Christian Bürkner, and Aki Vehtari. 2021. “Graphical Test for Discrete Uniformity and Its Applications in Goodness of Fit Evaluation and Multiple Sample Comparison.” arXiv:2103.10522 [Stat], March.
Schad, Daniel J., Michael Betancourt, and Shravan Vasishth. 2021. “Toward a Principled Bayesian Workflow in Cognitive Science.” Psychological Methods 26 (1): 103–26.
Schmon, Sebastian M., Patrick W. Cannon, and Jeremias Knoblauch. 2021. “Generalized Posteriors in Approximate Bayesian Computation.” arXiv:2011.08644 [Stat], February.
Sisson, S. A., Y. Fan, and Mark M. Tanaka. 2007. “Sequential Monte Carlo Without Likelihoods.” Proceedings of the National Academy of Sciences 104 (6): 1760–65.
Sisson, Scott A., Yanan Fan, and Mark Beaumont. 2018. Handbook of Approximate Bayesian Computation. CRC Press.
Stoye, Markus, Johann Brehmer, Gilles Louppe, Juan Pavez, and Kyle Cranmer. 2018. “Likelihood-Free Inference with an Improved Cross-Entropy Estimator.” arXiv:1808.00973 [Hep-Ph, Physics:physics, Stat], August.
Talts, Sean, Michael Betancourt, Daniel Simpson, Aki Vehtari, and Andrew Gelman. 2020. “Validating Bayesian Inference Algorithms with Simulation-Based Calibration.” arXiv:1804.06788 [Stat], October.
Tran, Dustin, Rajesh Ranganath, and David Blei. 2017. “Hierarchical Implicit Models and Likelihood-Free Variational Inference.” In Advances in Neural Information Processing Systems 30, edited by I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, 5523–33. Curran Associates, Inc.
Tran, Minh-Ngoc, David J. Nott, and Robert Kohn. 2017. “Variational Bayes With Intractable Likelihood.” Journal of Computational and Graphical Statistics 26 (4): 873–82.
Warne, David J., Thomas P. Prescott, Ruth E. Baker, and Matthew J. Simpson. 2021. “Multifidelity Multilevel Monte Carlo to Accelerate Approximate Bayesian Parameter Inference for Partially Observed Stochastic Processes.” arXiv:2110.14082 [q-Bio, Stat], October.
